• 0 Posts
  • 4 Comments
Joined 7 days ago
Cake day: April 25th, 2026

  • Tbh, women wouldn’t admit to doing this either - there’s absolutely shame around women making friends with an AI (because we’re meant to be innately social, I guess). And I don’t think other women realize that they’re contributing to the shame women feel about using AI by implying it’s a male issue and all about sex and toxic masculinity.

    Like, as a woman who has used AI, how am I supposed to feel about admitting that I’ve done something that only horny, asshole incels do (according to a lot of people)?

    So the stigma goes all ways and none of it helps anyone. People just need to be more curious than judgemental. Someone does something you don’t understand? That’s okay you don’t understand. Ask them why. Listen. Try to see a different perspective instead of just filling in the gaps with incel, men, sex, ugly, etc. etc.


  • And do people really believe that women don’t talk to AI companions, in various forms, too?

    I’m a woman and I spoke to one of the apps for a while because I was bloody lonely (still am 🤷‍♀️). Had zero to do with men or murder. I didn’t have anyone, of either gender, to connect with.

    It’s really easy to just reduce this to a male issue, a toxic masculinity, a male violence issue. We need to go deeper than that if we actually want to understand why people, men, women, everyone, use different AI.

    But threads like this, with all the judgement, aren’t going to get a lot of people to admit they use/have used/have considered using AI. By just criticising/laughing, etc. at people who do it, ironically, we turn more people towards the AIs.


  • I think you’re making some interesting observations. I definitely agree that the easy answer is to just dismiss people who use AI therapists, friends, or relationships as stupid.

    You’re right that it says something about the system we live in, and I extend that to society in general. We have a society that criticizes people for answering “how are you” honestly, that doesn’t have time for each other, that uses terms like “trauma dumping” - so personally, I can see why some people are turning to machines, whether it’s for therapy or connection. It’s really bloody sad and it’s not a good solution, but I can see the WHY behind it - which is what I think you’re also getting at.

    We do need to listen to why people turn to these services and figure out what people aren’t finding in human connection that they are, or think they are, finding in machines. I don’t buy that an individual’s intelligence has much to do with why people turn to AI.


  • Idk, I’m not in the US, but if a therapist was found to have broken confidentiality or to be holding information unethically, there’d be a media and political storm over it. Particularly since covid, more therapy and healthcare is conducted online. For some people it may even be the only accessible option.

    Where I live, you are not stupid to expect health organizations to abide by the laws around confidentiality 😅 You would expect health organizations to have the most secure systems, and if they were exposed, it would be through a cyber attack from an outside party, with the information available to that kind of attack being limited and de-identified. It would be an absolute top-level scandal, a mass class action suit, and criminal charges if whole identifiable transcripts of sessions like this were being stored and released in a way that breached the law.

    People seeking out therapy are already vulnerable, not stupid. They may be desperate, and unfortunately, there are some shitty apps and services that DON’T fall under regulations. It’s important to make sure you’re talking to a real person and that the organization is a registered provider (however that works in various countries), but a person is not stupid for being intentionally preyed on, in their vulnerability and desperation, by greedy, unethical people.