• Of all the voice LLMs, Gemini is the worst. You’ll ask a factual question, “How many Grammys has U2 won?” It will give an answer based on its training data, then ask, “Would you also like to know about blah blah blah?”

    No. I would not. Stop asking follow-up questions. I simply was curious about this one thing while listening to the radio.

    The fact that people fall in love with or depend on these autocomplete bots is crazy. Not to mention, they’re wrong as often as they’re right.

    • zurohki@aussie.zone · 31 points · 17 hours ago

      I watched someone try to use Gemini through Android Auto to navigate somewhere a few days ago. It was kind of amazing.

      He told it to navigate to a place, and it found a match on a different continent, refused to navigate there, and then rambled for a while about two other irrelevant places it wanted him to go to instead before it finally shut up and he could try again. It didn’t work the second time either.

      Ye olde Google Assistant, when told “navigate to <place name>”, will open maps and search for <place name>.

      I am a person. You are an object. Do as you’re fucking told, I’m not interested in listening to you trying to fake having an opinion.

    • CosmoNova@lemmy.world · 1 point · 11 hours ago

      I mean, some people are already essentially married to their car, so this doesn’t seem too far-fetched. We are silly monkeys.

    • thatKamGuy@sh.itjust.works · 8 points · 18 hours ago

      A lot of these LLMs heap praise on the user - some more blatantly than others - whether it’s warranted or not.

      Those most susceptible tend to be the ones who don’t regularly receive that recognition in their day-to-day lives, so they become infatuated with this “AI” that treats them nicer than they’re accustomed to.