Non-practicing intellectual.


FAQ

Hi! I’m Xiaq or E.L. Massey IRL.

image

You can find me on:

Instagram Here

AO3 Here

Substack Here

[email protected]

My published works are Like Real People Do (2022), Like You’ve Nothing Left to Prove (2023), All Hail the Underdogs (2023), and Free from Falling (2024), a series that started nearly a decade ago as OMGCP fic. The latter two are standalone books; the first two must be read together.

image

I’m currently working on an adult fantasy original fiction series as well as Breakaway Book Five and Star Trek fanfic on AO3.

If you’re looking for downloaded fic versions of LRPD and AHTU, I can’t send them to you for contract reasons, but I CAN direct you here and here where you may find folks who are willing to assist :)

5% of royalties from my first book, LRPD, goes to a queer charity each month. If you want to suggest a charity for future donations, you can do so via chat or any of the contact methods listed above.

Deacon is my 13-year-old retired service dog. He’s a Belgian Malinois, and I do not recommend Mals, as a breed, for service work. He’s a special boy. Feel free to message me if you want to talk about SD stuff.

If you’re looking for student stories/tales from my time in academia, the tags are #shitmystudentssay or #shitmystudentswrite.

If you’re looking for DIY/renovation content, the tag is #DIY

For posts about my partner, the tag is #B

If you’re looking for paramotor/flying content the tag is #paramotor or #x learns to fly.

I will not share or donate to any fundraisers unless I know the beneficiary, or someone I know can confirm they know the beneficiary.

Pinned Post faq E.L. Massey lrpd lrpd adjacent lynltp ahtu fff mywriting fyi just makin a pinned post don't mind me
xiaq

Anonymous asked:

since you’re in the tech world can you explain why giving your personal information to a chatbot/using chatgpt as a therapist or friend is a bad idea? my roommate recently revealed to me that she tells chatgpt about her entire day and worries and i’m trying to convince her to Not do that (unsuccessfully). since you actually work in tech do you have any ideas for how i can explain the risks and issues?

xiaq answered:

Oh boy. This will be a fast pass since I’m on my lunch break but here we go.

  1. OpenAI’s CEO Sam Altman has explicitly said you should not use ChatGPT as a therapist/friend. If the CEO is telling you “don’t do this,” don’t do this. Source
  2. The primary reason he cites is that there’s no legal privilege. No doctor/patient confidentiality. Altman even said that, in the event of a lawsuit or legal inquiry, OpenAI would produce the entirety of people’s conversations. Every word. There is zero privacy (and that’s aside from the fact that your data is being actively mined).
  3. Most chatbots are built to encourage engagement, prolong conversation (so you give them more content to mine), and be as agreeable as possible. This means they may inadvertently encourage someone who is delusional, reaffirm incorrect assumptions/statements that a human would call out, or even agree that a person should self-harm or kill themselves with no accountability. There are multiple cases now of people who have committed suicide or were hospitalized after interacting with chatbots (and at least one legal case now related to this). source, source, source, source
  4. Chatbots are only as good as the LLMs they’re built upon. So it’s unsurprising that they may show stigma against certain kinds of substance and/or mental health issues and may fail to recognize suicidal ideation. Source
  5. Finally, the American Psychological Association says don’t do it. Not all chatbots are universally harmful, and there are even some studies in which folks are actively trying to create therapy bots that avoid these pitfalls, with positive initial results, but ChatGPT is not one of them. Source
  6. And that’s not to mention the environmental impacts of using generative AI in general. Source

    I get it. I understand the desire for a free (or low-cost) therapist that’s available 24/7 without judgement. Some might argue it’s better than no therapy at all, but as someone who works with AI/LLMs, I promise it will be a cold day in hell before I ever use ChatGPT as a therapist.
xiaq

Tl;dr

Never trust a thing that pretends to have empathy while being incapable of it.

I mean not all ai it can be useful in certain forms but chat bot gen ai can die a quick and painful death ai generative ai hudson williams Instagram


Account Exec: (after an initial meeting with a prospect) Okay, I’m going to set up a demo with the customer.

Me: No, don’t do that. I can’t demo for them until we get the specialist team involved and they get me access to X Y connectors. Wait until they’ve responded about availability.

AE: I went ahead and scheduled it anyway. We can move it, if needed.

Me: Okay, I spoke to the specialist team. The environment won’t be ready until a week after you scheduled the meeting. You need to move it.

AE: But the customer is expecting to meet then!! They’ve already forwarded the meeting to several important people internally!! You need to escalate this!!

Me:

image
if only this could have been avoided like bro your poor planning does not constitute an emergency for me heavy sigh tech world things sure lets keep the original date I'm looking forward to seeing what you demo for them cuz I sure won't have anything
avoliot
going2hell4everythingbutbeingbi

my corner store guy is a 50 year old man who's my best friend in the world and recently he was like "you're too pretty to be single I have some nephews you should meet. very handsome!" and I was like "a niece might be more up my alley" and he just got more excited and said "ah even better! I was overselling my nephews but my nieces are very beautiful"

caustic-pixie

OP the tags!!

image
xiaq

My reps all gave me applause points (our internal reward system) for a job well done in 2025. I immediately converted them to Barnes and Noble gift cards and went on a book buying spree the likes of which I have never previously experienced. Such dopamine.

image
image

And then I pre-ordered a bunch of books as well so they’ll be nice little surprises through the year as they show up.

image
xiaq

Just got the notification they’ve shipped and now I’m all pakidge??? about it.

image

mylife books reading the perks of corporate work