Of all the voice LLMs, Gemini is the worst. You’ll ask a factual question, “How many Grammys has U2 won?” It will give an answer based on its training and then ask, “Would you also like to know about blah blah blah?”
No. I would not. Stop asking follow-up questions. I was simply curious about this one thing while listening to the radio.
The fact that people fall in love with or depend on these autocomplete bots is crazy. Not to mention, they’re wrong as often as they’re right.
Correct answer: who cares.
I watched someone try to use Gemini through Android Auto to navigate somewhere a few days ago. It was kind of amazing.
He told it to navigate to a place, and it found a match on a different continent, refused to navigate to it, and then rambled for a while about two other irrelevant places it wanted him to go to instead before it finally shut up and he could try again. It didn’t work the second time either.
Ye olde Google Assistant, when told “navigate to <place name>”, will open maps and search for <place name>.
I am a person. You are an object. Do as you’re fucking told, I’m not interested in listening to you trying to fake having an opinion.
The fake opinion is to steer you towards some marketing crap.
I mean, some people are already essentially married to their car, so this doesn’t seem too far-fetched. We are silly monkeys.
A lot of these LLMs heap praise on the user - some more blatantly than others - whether it’s warranted or not.
Those most susceptible tend to be the ones who don’t regularly receive that recognition in their day-to-day lives, so they become infatuated with this “AI” that treats them nicer than they’re accustomed to.
So you’re saying the world just needs a little more love ❤️
Well really it needs a lot more, but yes. That would, unironically, fix a lot of problems.
“In 0.2 miles, turn right after Waffle House. Also, you smell nice and have a huge dick.”
A certain portion of “dumb” is necessary to fall for this automated crap.