• HubertManne@piefed.social · 16 hours ago

    It does not understand what it's saying. It's fine for summarizing some searches or bringing forth known best practices, but I would not call what it does conversation.

    • TubularTittyFrog@lemmy.world · 12 hours ago

      People fall in love with fictional characters in books and other media, mostly as a product of their imagined interactions with the character.

      This isn't any different; it's just an AI version of it. It's still mostly imaginative fantasy at the end of the day, and a form of escapism from the real world.

      The New Yorker had an article about it where a housewife basically had an AI boyfriend, her own version of Geralt from The Witcher, and was using it to cope with a stillbirth from five years earlier; her AI Geralt was the only one who "really understood her" and her struggles with that trauma. It's all entirely a fiction in her head, but it's a mechanism for self-soothing that is relatively harmless compared to, say, doing drugs or divorcing her husband or other coping methods that might manifest. It was basically fan fiction with an AI agent helping her co-write.

    • Grimy@lemmy.world · 14 hours ago

      Kind of feels like semantics.

      Let's say I give you a Discord link and tell you that half the people there are bots and half aren't. Realistically, LLMs are at a level where you won't be able to tell which is which.

      So what then? You're only having a conversation half the time, but you can't point out when that is? Feels a bit hollow.

      This probably already happens on Lemmy: you probably have interactions that you count as conversations in your head but that are actually with bots.

      • chunes@lemmy.world · 30 minutes ago

        People are in for a rude awakening when we discover that ‘next token prediction’ is what intelligence means after all.
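
        As an aside, "next token prediction" refers to how these models literally generate text. Below is a minimal sketch, assuming the Hugging Face transformers library with GPT-2 purely as a small illustrative stand-in (not a claim about what any particular chatbot runs): the model scores every possible next token and the loop greedily appends the most likely one, over and over.

        ```python
        # Minimal sketch of "next token prediction": the model assigns a score to
        # every token in its vocabulary, and generation is just repeatedly picking
        # a next token and appending it. GPT-2 is used here only as a small,
        # freely available example model.
        import torch
        from transformers import AutoModelForCausalLM, AutoTokenizer

        tokenizer = AutoTokenizer.from_pretrained("gpt2")
        model = AutoModelForCausalLM.from_pretrained("gpt2")

        input_ids = tokenizer("The capital of France is", return_tensors="pt").input_ids

        for _ in range(5):
            with torch.no_grad():
                logits = model(input_ids).logits      # shape: (1, seq_len, vocab_size)
            next_id = logits[0, -1].argmax()          # greedy: take the most likely token
            input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

        print(tokenizer.decode(input_ids[0]))
        ```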

      • HubertManne@piefed.social · 10 hours ago

        Back-and-forths, sure; only some attain the level of a conversation. Yeah, bots exist, but social media is not a substitute for the real world. I would not call it semantics; it's my experience talking/chatting with humans and with AI. My big suggestion for folks who want to get an idea of an LLM's limits is to engage it on a topic you are very familiar with, or one where you can see the effects immediately, like playing a video game. I have been using it with Baldur's Gate and it's been... interesting.

        • Grimy@lemmy.world · edited · 8 hours ago

          Ya, I get what you mean; I'm just saying that to claim there's a difference, you would have to be able to see that difference in a blind test.

          I understand they have limits, but so do regular people. You don’t need to be an expert on a subject to hold a conversation about it.

          They aren't intelligent and all that, and they make the stupidest mistakes, but they can more or less hold a convo just as well as the average rando on the internet.

          It's definitely hollow, but I get why people are getting caught up in it.