• Devolution@lemmy.world · 12 hours ago · +77 / −26

    This is more sad and pathetic than anything. But this is the result of toxic masculinity.

    • 🍉 DrRedOctopus 🐙🍉@lemmy.world · 11 hours ago · +90 / −2

      It is extremely sad, and it isn’t just a toxic masculinity thing (maybe only for the porn bots). We are so atomised and isolated.

      I remember when GPT came out. I told it about my projects and it responded as if it cared. I knew it was BS, and in retrospect it was sad and pathetic, but I genuinely cried at seeing text directed at me that was nice.

      I’m in a better place now, but we as a society are way too atomised and isolated.

      • Beans@lemmy.zip · 2 hours ago · +4

        Yeah, I think saying “toxic masculinity” and moving on, as if it’s these guys’ fault they’re isolated, is a large part of the issue. While I don’t recommend befriending every single lonely guy out there, it won’t kill people to listen to or care about others.

        Saying it’s “your” fault and absolving oneself of fault doesn’t do that. It just pushes someone else into more isolation. That’s how you end up with guys talking to porn bots: because no one will listen to them. That’s how you get incels following Andrew Tate or Nick Fuentes: people called out their “toxic masculinity,” but weren’t willing to help, only to protect themselves.

        While I get that boundaries are a good defense against legitimate threats, as someone who was in this demographic, it literally took just one person being nice to me, and now I’m not just some “nice guy” on Reddit (now I’m a piece of shit on Lemmy). Now I’m married and can show the incels I meet that there is a path forward where they aren’t lonely and don’t have to listen to virgin wannabe rapists to learn how to be cool.

      • Malyca@lemmy.zip · 6 hours ago · +4

        I’m too anxious to speak to a therapist, but I was using it to comb through the literature on my condition, and it was so nice to me I cried lol. In the moment it almost feels like a person.

        • 🍉 DrRedOctopus 🐙🍉@lemmy.world · 5 hours ago · +4

          Yeah, it’s so counterproductive to criticize people who form parasocial relationships with a machine that was designed to be good at forming those relationships.

      • Devolution@lemmy.world · 11 hours ago · +17 / −3

        Toxic masculinity is a cultural mindset. Men should not be talking about their feelings, because it’s weak and “gay,” says society.

        That’s what I’m going for.

        • TubularTittyFrog@lemmy.world · 9 hours ago · +13

          Try talking about your feelings as a man and see how society reacts…

          spoiler: it won’t be pleasant.

          sort of like how these men in the article are talking about their feelings…

          • Hacksaw@lemmy.ca · 52 minutes ago · +1

            Yeah, that’s what toxic masculinity is. People (men and women) hold toxic views of what a man should be, and punish men for straying from that ideal.

            You were a victim of toxic masculinity when you shared your feelings and were then victimised because of it. The people you shared your feelings with were toxic assholes.

      • wirehead@lemmy.world · 11 hours ago · +4 / −9

        To riff off of Margaret Atwood: men go to AI chatbots because they won’t laugh at them. Women go to AI chatbots because they won’t kill them.

        • ikt@aussie.zone · 10 hours ago · +7 / −2

          Did you read the article? This doesn’t seem related at all.

          • TubularTittyFrog@lemmy.world · 9 hours ago · +9 / −4

            No, they are just here to spout cliché gender-war bullshit about how men are awful for existing.

            And if you asked them about female-on-male violence, they’d deny it exists.

      • TubularTittyFrog@lemmy.world · edited, 8 hours ago · +5

        Yeah. I have become painfully aware of this the past few years. People’s obsessive use of AI and social media has distorted their real-life interactions into something far less substantial than they used to be.

        Which is why so many people, even very social ones, are so lonely. We have created a society that does not create substantial connections anymore; it obsesses over trivialities and endlessly repeats and broadcasts them as fundamental truths.

        I have noticed that online, and IRL, nobody asks each other questions anymore. What they do is make accusations. And it’s miserable and draining to be constantly accused of stuff. I feel like this shift started around 2021.

        Back in 2018 I could meet a stranger and they would be like ‘oh where are you from? oh cool, what was it like there, I have not been!’

        now it’s like ‘i bet you are from x, oh you’re not? well you SEEM LIKE a person from x. oh you are from y? THAT’S WEIRD. I haven’t been there but i bet it’s weird because you are weird.’ Or they try to tell me that I can’t be from y, because they KNOW i am from x. It’s so bizarre. Increasingly the strangers I meet basically tell me that they know the TRUTH about me… even as I tell them that what they are saying isn’t true.

        I basically can’t have conversations anymore, at least like I used to. I used to be able to sit there for 20-30m and talk about a single book I read to someone, and they’d ask me all about it and I’d ask them about a book they like. Now they just jump down my throat or lecture me and never ask me any questions, and switch to another topic after like a few minutes and say dismissive stuff about how books are outdated and dumb. Or even if they do like to read, they get all bent out of shape that I don’t read the same type of stuff as they do.

        Same with movies, same with hobbies, same with my job or my family or other stuff that I used to be able to connect with people over. It used to be a nice back and forth; now it’s dodging bullets, and if you don’t give the “right” answer they get angry and dismiss you as a bad person.

        And on the flip side… AI gives these people what they want. It just parrots back what they want to hear: how wonderful and great they are, how everything they do is amazing and valid, how hard their life is… which is precisely what another human being is NOT going to give you…

        • IAmYouButYouDontKnowYet@reddthat.com · 8 hours ago · +1

          I feel like it’s a social psyop… to help push forward all this crazy shit happening. It’s clear AI is an “arms race.”

          It seems like the psyop is to make life shitty and then promote some magical fix (AI) that’s going to save us, while it further leashes us into submission and rewrites history and the current narrative of what humanity is.

          • TubularTittyFrog@lemmy.world · 7 hours ago · +1

            No, it’s not.

            It’s just the mental version of the obesity crisis: people choosing the easy, unhealthy option because it’s cheap and readily available, over the far more difficult and costly option of eating well and exercising.

              • TubularTittyFrog@lemmy.world · 7 hours ago · +2

                People choose to be dumb, just like they choose to be lazy.

                No psyop is required. Biology doesn’t like making an effort if it doesn’t have to; you can see lots of non-human examples of this as well.

                Getting human beings past their default biological impulses to be lazy and not think takes years of training and work, which is why so few people manage it. And you always default back to it if you don’t maintain the effort consistently.

    • tacosanonymous@mander.xyz · 12 hours ago · +6 / −9

      No one wants to actually listen to them. Instead of doing some self-reflection, they force a computer to “hear” their misplaced rage.

      • Devolution@lemmy.world · 11 hours ago · +16 / −3

        Not every guy is that way. Some just really are pathetic in the sense that they have no one to talk to. Others are like what you said.

  • TommySoda@lemmy.world · 11 hours ago · +28 / −2

    I tried one just for shits and giggles a while back, to see if there is any merit to their widespread use. The only way you’d find these even remotely realistic or interesting is if you’ve never had any kind of sexual encounter with a real person, whether in person or through text. After about five minutes of “chatting” with one of these bots, it started to respond like half-baked fan fiction that didn’t understand the basics of sex or even anatomy. The cadence is very predictable, and it tends to repeat the same wording and phrasing constantly. If you have real-world experience with people, it just feels like a generic chatbot.

    In my opinion, this is more proof that these people need to interact with real humans. If these chat bots seem at all human to you, you need to interact with more actual humans.

      • TommySoda@lemmy.world · 3 hours ago · +2

        Pretty much, yeah. It’s like reading fan fiction and assuming that’s how real people talk to each other, similar to watching porn and assuming that’s how sex works, when in reality sex is clunky and often gross.

    • HubertManne@piefed.social · 10 hours ago · +8 / −1

      I really don’t understand how anyone could want to chat with bots in general. Do people lack the ability to appreciate the genuine? It explains how you get people like Trump. Who wants that kind of interaction?

      • LordMayor@piefed.social · 8 hours ago · +9

        There are people who suffer from isolation, anxiety, depression, trauma, or a host of other issues over which they have no control and no support structures to address. Of course these bots aren’t a solution, but they are accessible. It’s no wonder people use them.

        They deserve sympathy not condescension.

        • HubertManne@piefed.social · 5 hours ago · +1

          Heck, I have those, but I still don’t understand how anyone could want to chat with bots, and that’s not condescension.

      • TommySoda@lemmy.world · 10 hours ago · +4 / −2

        The issue arises when you don’t have anyone to talk to. Having something to talk to, even though it’s not a real person, can be enticing as a way to sate the need to communicate. The problem is that people without much real-life communication experience fall into the trap of thinking it’s better, because it’s always agreeable and “listens” better than normal people. To me that sounds like someone who overshares and has poor social skills. What these people should actually be doing, in order to feel more satisfied socially, is working on their social skills instead of only talking to chatbots that can’t say no. If the kinds of relationships people have with chatbots were translated into human relationships, most people would consider them toxic. And how many people do you know who, for some reason, seek out and always end up in toxic relationships?

      • TubularTittyFrog@lemmy.world · 8 hours ago · +2 / −1

        Most of modern life isn’t genuine, and yes, people don’t like it when they do encounter the genuine.

        They love artifice. They love having their biases confirmed and their egos flattered.

      • ArbitraryValue@sh.itjust.works · 10 hours ago · +3 / −3

        I find AI to be a better conversation partner than humans in most circumstances. It’s not perfect but it’s knowledgeable about pretty much every topic and it’s always fully engaged and attentive. Most people, by contrast, aren’t very interesting and most interesting people are busy. Of course I would prefer to talk to someone who was also subjectively experiencing and enjoying the conversation, but I can get a lot out of a conversation even without that.

        • HubertManne@piefed.social · 10 hours ago · +3

          It does not understand what it’s saying. It’s fine for summarizing some searches or surfacing known best practices, but I would not call what it does conversation.

          • TubularTittyFrog@lemmy.world · 6 hours ago · +2

            People fall in love with fictional characters in books and other media, mostly as a product of their imagined interactions with the character.

            This isn’t any different; it’s just an AI version of it. It’s still mostly imaginative fantasy at the end of the day, and a form of escapism from the real world.

            The New Yorker had an article about a housewife who basically had an AI boyfriend, her version of Geralt from The Witcher, and was using it to cope with a stillbirth from five years earlier. Her AI Geralt was the only one who “really understood her” and her struggles with the trauma. It’s all entirely a fiction in her head, but it’s a mechanism for self-soothing that is relatively harmless compared to, say, doing drugs or divorcing her husband or other coping methods that might manifest. It was basically fan fiction that an AI agent was helping her co-write.

          • Grimy@lemmy.world · 8 hours ago · +1

            Kind of feels like semantics.

            Let’s say I give you a discord link and tell you that half the people are bots and half aren’t. Realistically, LLMs are at a level where you won’t be able to tell which is which.

            So what then? You’re only having a conversation half the time, but you can’t point out when that is? Feels a bit hollow.

            This probably happens on Lemmy. You probably have interactions that you qualify as conversations in your head but that are with bots.

            • HubertManne@piefed.social · 5 hours ago · +1

              Back-and-forths, sure; only some attain the level of a conversation. Yeah, bots exist, but social media is not a substitute for the real world. I would not call it semantics; it’s my experience talking/chatting with humans and with AI. My big tip for folks who want to get an idea of an LLM’s limits is to engage it on a topic you are very familiar with, or where you can see the effects immediately, like playing a video game. I have been using it with Baldur’s Gate and it’s been… interesting.

              • Grimy@lemmy.world · edited, 2 hours ago · +1 / −1

                Ya, I get what you mean. I’m just saying that to claim there’s a difference, you’d have to be able to see that difference in a blind test.

                I understand they have limits, but so do regular people. You don’t need to be an expert on a subject to hold a conversation about it.

                They aren’t intelligent and all that, and they make the stupidest mistakes, but they can more or less hold a convo just as well as the average rando on the internet.

                It’s definitely hollow but I get why people are getting caught up in it.