
Non-practicing intellectual.

@xiaq / xiaq.tumblr.com

Xiaq on AO3. E.L. Massey for published work.

Pinned

FAQ

Hi! I’m Xiaq or E.L. Massey IRL.

You can find me on:

Instagram Here

AO3 Here

Substack Here

My published works are Like Real People Do (2022), Like You’ve Nothing Left to Prove (2023), All Hail the Underdogs (2023), and Free from Falling (2024), a series that started nearly a decade ago as OMGCP fic. The latter two are standalone books; the first two must be read together.

If you’re looking for downloaded fic versions of LRPD and AHTU, I can’t send them to you for contract reasons, but I CAN direct you here and here where you may find folks who are willing to assist :)

5% of royalties from my first book, LRPD, goes to a queer charity each month. If you want to suggest a charity for future donations, you can do so via chat or any of the above listed ways to contact me.

Deacon is my 13-year-old retired service dog. He's a Belgian Mal, and I do not recommend Mals, as a breed, for service work. He's a special boy. Feel free to message me if you want to talk about SD stuff.

If you're looking for student stories/tales from my time in academia, the tags are #shitmystudentssay or #shitmystudentswrite.

If you’re looking for DIY/renovation content, the tag is #DIY

For posts about my partner, the tag is #B

If you’re looking for paramotor/flying content, the tags are #paramotor or #x learns to fly.

I will not share or donate to any fundraisers unless I know the beneficiary, or someone I know can confirm they know the beneficiary.

Breakaway book 5 is a go. First chapter will be posted next week!

It features Theo (Alex’s trans younger brother seeking sanctuary) and Dev (a rookie with a traumatic past who is living with Eli and Alex for his first season with the Hounds). After an antagonistic introduction, the boys realize they have more in common than they initially thought (and maybe they should kiss about it??) but they’ve got a lot of mutual pining to do before admitting anything like feelings (not to mention other concerns like hockey-related subterfuge, blackmail, and parents who probably shouldn’t be).

Reblogged
Anonymous asked:

since you’re in the tech world can you explain why giving your personal information to a chatbot/using chatgpt as a therapist or friend is a bad idea? my roommate recently revealed to me that she tells chatgpt about her entire day and worries and i’m trying to convince her to Not do that (unsuccessfully). since you actually work in tech do you have any ideas for how i can explain the risks and issues?

Oh boy. This will be a fast pass since I’m on my lunch break but here we go.

  1. OpenAI’s CEO Sam Altman has explicitly said you should not use ChatGPT as a therapist/friend. If the CEO is telling you “don’t do this,” don’t do this. Source
  2. The primary reason he cites is that there’s no legal privilege. No doctor/patient confidentiality. Altman even said that, in the event of a lawsuit or legal inquiry, OpenAI would produce the entirety of people’s conversations. Every word. There is zero privacy (and that’s aside from the fact that your data is being actively mined).
  3. Most chatbots are built to encourage engagement, prolong conversation (so you give them more content to mine), and be as agreeable as possible. This means they may inadvertently encourage someone who is delusional, reaffirm incorrect assumptions/statements that a human would call out, or even agree that a person should self-harm or kill themselves with no accountability. There are multiple cases now of people who have committed suicide or were hospitalized after interacting with chatbots (and at least one legal case now related to this). source, source, source, source
  4. Chatbots are only as good as the LLMs they’re built upon. So it’s unsurprising that they may show stigma against certain kinds of substance and/or mental health issues and may fail to recognize suicidal ideation. Source
  5. Finally, the American Psychological Association is saying don’t do it. Not all chatbots are universally harmful, and there are even some studies in which folks are actively trying to create therapy bots without these pitfalls, with positive initial results, but ChatGPT is not one of them. Source
  6. And that’s not to mention the environmental impacts of using generative AI in general. Source I get it. I understand the desire for a free (or low-cost) therapist that’s available 24/7 without judgement. But while some might argue that it’s better than no therapy at all, speaking as someone who works with AI/LLMs: it will be a cold day in hell before I ever use ChatGPT as a therapist.

Tl;dr

Never trust a thing that pretends to have empathy while being incapable of it.


Account Exec: (after an initial meeting with a prospect) Okay, I'm going to set up a demo with the customer.

Me: No, don't do that. I can't demo for them until we get the specialist team involved and they get me access to X Y connectors. Wait until they've responded about availability.

AE: I went ahead and scheduled it anyway. We can move it, if needed.

Me: Okay, I spoke to the specialist team. The environment won't be ready until a week after you scheduled the meeting. You need to move it.

AE: But the customer is expecting to meet then!! They've already forwarded the meeting to several important people internally!! You need to escalate this!!

Me:

Reblogged avoliot

my corner store guy is a 50 year old man who's my best friend in the world and recently he was like "you're too pretty to be single I have some nephews you should meet. very handsome!" and I was like "a niece might be more up my alley" and he just got more excited and said "ah even better! I was overselling my nephews but my nieces are very beautiful"

OP the tags!!

Reblogged tucsonhorse

tumblr users love reading. you literally stopped for this post just because it has words in it

this is one of my favorite bits about tumblr

the users seem to actually prefer text posts to anything else, and treat it as a chore to play a video especially with sound

Reblogged

My reps all gave me applause points (our internal reward system) for a job well done in 2025. I immediately converted them to Barnes and Noble gift cards and went on a book buying spree the likes of which I have never previously experienced. Such dopamine.

And then I pre-ordered a bunch of books as well so they’ll be nice little surprises through the year as they show up.

Just got the notification they’ve shipped and now I’m all pakidge??? about it.

Reblogged

We went to the Blossoms of Light event at the Denver Botanic gardens and when we saw a greenhouse-related opportunity, we took it. ♥️

This is the second year we’ve gone and will definitely go again next year.

Reblogged dykealloy

I love reading fanfics where one character is tagged as jealous, but their partner is unlovable to anyone but them. Like.. calm down, sweetheart, no one wants your man. We're still trying to figure out why you want your man.

My reps all gave me applause points (our internal reward system) for a job well done in 2025. I immediately converted them to Barnes and Noble gift cards and went on a book buying spree the likes of which I have never previously experienced. Such dopamine.

And then I pre-ordered a bunch of books as well so they’ll be nice little surprises through the year as they show up.
