
Non-practicing intellectual.

@xiaq / xiaq.tumblr.com

Xiaq on AO3. EL Massey for published work.
Reblogged
Anonymous asked:

since you’re in the tech world can you explain why giving your personal information to a chatbot/using chatgpt as a therapist or friend is a bad idea? my roommate recently revealed to me that she tells chatgpt about her entire day and worries and i’m trying to convince her to Not do that (unsuccessfully). since you actually work in tech do you have any ideas for how i can explain the risks and issues?

Oh boy. This will be a fast pass since I’m on my lunch break, but here we go.

  1. OpenAI’s CEO Sam Altman has explicitly said you should not use ChatGPT as a therapist/friend. If the CEO is telling you “don’t do this,” don’t do this. Source
  2. The primary reason he cites is that there’s no legal privilege. No doctor/patient confidentiality. Altman has even said that, in the event of a lawsuit or legal inquiry, OpenAI would produce the entirety of people’s conversations. Every word. There is zero privacy (and that’s aside from the fact that your data is being actively mined).
  3. Most chatbots are built to encourage engagement, prolong conversation (so you give them more content to mine), and be as agreeable as possible. This means they may inadvertently encourage someone who is delusional, reaffirm incorrect assumptions or statements that a human would call out, or even agree that a person should self-harm or kill themselves, with no accountability. There are multiple cases now of people who have committed suicide or been hospitalized after interacting with chatbots (and at least one related lawsuit). source, source, source, source
  4. Chatbots are only as good as the LLMs they’re built on, so it’s unsurprising that they can show stigma toward certain substance use and mental health issues and can fail to recognize suicidal ideation. Source
  5. The American Psychological Association is also saying don’t do it. Not all chatbots are categorically harmful; there are even some studies in which folks are actively trying to create therapy bots that avoid these pitfalls, with positive initial results, but ChatGPT is not one of them. Source
  6. And that’s not to mention the environmental impacts of using generative AI in general. Source

I get it. I understand the desire for a free (or low-cost) therapist that’s available 24/7 without judgement. But while some might argue that it’s better than no therapy at all, as someone who works with AI/LLMs, it will be a cold day in hell before I ever use ChatGPT as a therapist.

Tl;dr

Never trust a thing that pretends to have empathy while being incapable of it.

Reblogged

My new boss: “Everyone come to the team meeting with a surprising story about something you’ve done in the past. Something no one would expect of you!”

Me: Googling the statute of limitations for felonies in Texas

I won.

noooo you can't leave us hanging like this

we ~~demand~~ politely request to know the (alleged) felony!!

Mom, don’t read this.

My publisher said I should engage more with social media so I turned this story into a reel. Please go like it so I can retreat, hissing, back into my writing cave.

Hey friends! If you’re coming to Romance Con this year, I’ll be there! (Deacon will be too, provided he stays in good health post-liver-fiasco.)

Let me know if you’re attending and stay tuned for my next DIY project: free swag, signing table decor, and possibly a freestanding photo op backdrop, if I can figure out how to make something that can break down and fit in a suitcase. ❤️
