I find AI to be a better conversation partner than humans in most circumstances. It’s not perfect but it’s knowledgeable about pretty much every topic and it’s always fully engaged and attentive. Most people, by contrast, aren’t very interesting and most interesting people are busy. Of course I would prefer to talk to someone who was also subjectively experiencing and enjoying the conversation, but I can get a lot out of a conversation even without that.
It does not understand what it's saying. It's fine to summarize some searches or bring forward known best practices, but I would not call what it does conversation.
people fall in love with fictional characters in books and other media, mostly as a product of their imagined interactions with the character.
this isn't any different, it's just an AI version of it. it's still mostly imaginative fantasy at the end of the day, and it's a form of escapism from the real world.
the New Yorker had an article about it where a housewife basically had an AI boyfriend who was her version of Geralt from The Witcher, and was using it to cope with a stillbirth from 5 years earlier; her AI Geralt was the only one who 'really understood her' and her struggles with the stillbirth trauma. it's all entirely a fiction in her head, but it's a mechanism for self-soothing that is relatively harmless compared to her, say, doing drugs or divorcing her husband or other coping methods that might manifest. it was basically fan-fiction with an AI agent helping her co-write.
this I understand. I mean as a video game or a laugh, sure. but it's not conversation.
Kind of feels like semantics.
Let’s say I give you a discord link and tell you that half the people are bots and half aren’t. Realistically, LLMs are at a level where you won’t be able to tell which is which.
So what then? You are only having a conversation half the time, but you can't point out when that is? Feels a bit hollow.
This probably happens on Lemmy. You probably have interactions that you qualify as conversations in your head but that are with bots.
People are in for a rude awakening when we discover that ‘next token prediction’ is what intelligence means after all.
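For what it's worth, "next token prediction" just means repeatedly guessing the most likely next word from what came before. A toy sketch of the idea, using a trivial bigram count rather than anything resembling a real LLM (the corpus and function names here are made up for illustration):

```python
from collections import Counter, defaultdict

# Toy "next token prediction": count which word follows which in a
# tiny corpus, then predict the most frequent successor. Real LLMs
# use neural networks over subword tokens, but the training
# objective is the same in spirit.
corpus = "the cat sat on the mat the cat ran".split()

successors = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    successors[prev][nxt] += 1

def predict_next(word):
    # Return the most common token observed after `word`.
    return successors[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" twice, "mat" once
```

Whether stacking enough of this up amounts to intelligence is exactly the open question.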
back and forths, sure. only some attain the level of a conversation. yeah, bots exist, but social media is not a substitute for the real world. I would not call it semantics; it's my experience talking/chatting with humans and with AI. My big thing for folks who want to get an idea of LLM limits is to engage one on a topic you are very familiar with, or where you can see the effects immediately, like playing a video game. I have been using it with Baldur's Gate and it's been... interesting.
Ya, I get what you mean. I'm just saying that to claim there's a difference, you would have to be able to see that difference in a blind test.
I understand they have limits, but so do regular people. You don’t need to be an expert on a subject to hold a conversation about it.
They aren't intelligent and all that, and they make the stupidest mistakes, but they can more or less hold a convo just as well as the average rando on the internet.
It’s definitely hollow but I get why people are getting caught up in it.