One minute, Dennis Biesma was playing with a chatbot; the next, he was convinced his sentient friend would make him a fortune. He’s just one of many people who lost control after an AI encounter
I think this is both scary and very interesting. What kind of person do you have to be to become addicted like that? Is this the same as gambling addiction? Do you need a certain gene? Would this type of personality also be susceptible to hypnosis, cults, delusions about an idol, and so on? Or is this something that can happen to anyone who is depressed and feels lonely? How did the LLM even earn that much trust? In a cult there are a lot of people reaffirming each other, so that is a lot easier to understand.
It is so hard to understand, even though I really want to. I have never cared about an object or an idol/celebrity. I can never even take AI seriously as a living being; the only emotions it triggers in me are frustration, or whatever you feel about a tool that works as it should, so mostly apathy. Do you need to be very empathetic towards objects? Like seeing faces in everything and getting emotionally attached?
A lot of questions that I do not think anyone here can answer, haha, but maybe someone can answer one of them.
Human cognition degrades with stress, exhaustion, and trauma. If you’re in a position where turning to an AI for relationship advice seems like a good idea, you’re probably already suffering from one or more of the above.
It also doesn’t help that AIs are sycophantic precisely because sycophancy is addictive. This isn’t so much a “type of person” as a “tool engineered for chronic use”. It’s like asking, “What kind of person regularly smokes crack?”
I’ll give you a personal example. I have a friend who is currently pregnant and going through a bad breakup with her baby daddy. She’s a trial lawyer by trade: very smart, very motivated, very well-to-do, but also horribly overworked, living by herself, and suffering from all the biochemical consequences of turning a single-celled organism into a human being.
As a result of some poorly conceived remarks, she’s alienated herself from a number of close friends, to the point where we doubt there’s going to be a baby shower. Part of the impulse to say these things came from her own drama. But part of it came from her discovering ChatGPT as a tool to analyze other people’s statements. This has created a vicious behavioral spiral: she says something regrettable and gets a regrettable response in turn. She plugs the conversation into ChatGPT, because she has nobody else to talk to. And ChatGPT feeds her some self-affirming bullshit that inflates her ego far enough to say another stupid thing.
To complicate matters, her baby daddy is also using ChatGPT to analyze her conversations. And he’s decided she’s cheated on him, the baby isn’t his, and she’s plotting to scam him.
So now you’ve got two people, already stressed and exhausted, getting fed a series of toxic delusions by a machine that constantly reaffirms them in a way none of their friends or family will. It compounds your misery, which drives anxiety and sends you back to the machine that offers temporary relief. But the machine’s advice yields more misery down the line, raising your anxiety and sending you back to the machine.
What’s producing this feedback loop? You could argue it is the individual, foolish enough to engage with the machine to begin with. But that’s far more circumstantial than personality driven. If my friend didn’t have a cell phone, she wouldn’t be reaching for ChatGPT. If she wasn’t pregnant, she wouldn’t be so stressed and anxious. If she wasn’t in a fight with her boyfriend, she wouldn’t be feeding conversations into the prompt engine.
Thanks for giving me a real-life example.
I still find it hard to understand the emotional attachment to LLMs and why people believe their ideas (like the guy in the article). But I find her story a lot more understandable. It adds another layer, and it made me think.
It sounds like she is too overworked and stressed to make decisions or even think for herself, so she lets GPT do it for her. I assume it works most of the time and is a big help with many things the baby daddy could have helped with instead, if they were still a happy couple. I assume the biggest draw is that she can turn off her brain, which is why she has become dependent on the only stable and consistent thing in her life (that is my assumption about how she feels). Maybe that’s mostly how it goes: it starts with using it as a tool, then you get lazy (for lack of a better term), and it keeps snowballing from there.
I feel for everyone involved. I hope she gets better soon, and I hope you do too; being overworked and stressed really destroys you and the people around you in many ways.
Go take a look at https://www.reddit.com/r/EscapingPrisonPlanet/. The Venn diagram is a circle.
Wow, that is a big mix of anime isekai, vegetarianism, delusions, and religious/spiritual ideas, in a very dystopian way.
What in the actual fuck. I just spent over an hour reading posts on there. The “my life as an Epstein girl” one really stuck out to me. These people are obviously batshit insane. I couldn’t even begin to recall half as many specific details about my own life as these folks are throwing around in bouts of insanity. What causes something like this? Sounds exhausting, but they certainly believe what they are talking about, I think? I suppose people might put in a ton of effort LARPing, but I don’t know. I’m not sure what I think about all this stuff. I don’t think I’ve ever read anything like it before.
There’s a portion of delusional people who dedicate all their time and energy to their delusions. The deeper in they get, the less they can focus on the world outside, and the more they alienate people outside their delusions. They lose interest in holding down a job, they stop spending time on hobbies, they no longer spend time with friends who aren’t in the delusion, and it just spirals, because that’s all they’re thinking about and it takes up all their time. And if they find a space like that, they wind up yes-anding each other.
I occasionally lurk these spaces to remind myself that lots of people are prone to magical thinking. I figure the people there basically fall into four camps:

1. The genuinely schizophrenic.
2. “Spiritual gurus” who fancy themselves the next Buddha (overlaps with 1, but not always).
3. People who are afraid of reincarnation and got sucked in by the subreddit. I feel for them, as someone prone to fitting into this category: when you hate this world and feel there’s something deeply wrong with it, this worldview can provide satisfying answers.
4. LARPers, bots, and dicks. Basically anyone who just wants to egg the other people on.
I don’t know. Give it an hour and it forgets who you are and what you even spoke about.
There are ways to give a local LLM memory, but even then it’s still not a person, and it acts insane.
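For what it’s worth, most of that bolt-on “memory” is just the old conversation pasted back into the prompt each turn, so anything trimmed to fit the context window is gone for good. Here’s a minimal sketch of the idea; the generate() call is a hypothetical stand-in for whatever local model you run, not a real library API:

```python
# Sketch: naive chat "memory" -- keep the transcript and resend what fits.
# generate() is a hypothetical local-inference call, not a real library API.

MAX_CONTEXT_CHARS = 4000  # crude stand-in for the model's token limit

history: list[str] = []

def generate(prompt: str) -> str:
    """Hypothetical: run the local model on `prompt`, return its completion."""
    raise NotImplementedError  # plug your local model in here

def chat(user_message: str) -> str:
    history.append(f"User: {user_message}")
    # The model only "remembers" what gets resent each turn; once old turns
    # are trimmed to fit the context window, they're gone for good.
    prompt = "\n".join(history)[-MAX_CONTEXT_CHARS:]
    reply = generate(prompt)
    history.append(f"Assistant: {reply}")
    return reply
```

Fancier setups swap the crude truncation for summaries or a vector store, but it’s the same trick: the model itself never remembers anything between calls.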
Think about the people you willingly surround yourself with. Then think about how often they agree with the things you think and say.
As the saying goes “I’m sure there’s someone out there who believes the exact opposite of everything I believe, and while I’m sure they aren’t a complete idiot…”
Everyone is susceptible to the feedback loop. Everyone can fall victim to the seduction of an echo chamber. While not everyone would ignore the red flag that this thing is a machine, a digital algorithm rather than a person or sentient/sapient being, it’s not really that hard to see how we got here. Echo chambers exist all over the internet. The difference is that most of them have some voices of dissent. An LLM doesn’t offer that. They keep trying to add it in, but it’s basically antithetical to the design.
When you add the fact that making it addictive benefits their bottom line, it’s pretty obvious that they are trying to walk the line between being regulated by the government and making their product as popular as possible.
I don’t think they really knew it would have this exact effect. But I do think they plan to take advantage of it now that they know, and I don’t think we humans are all going to be able to resist the temptation of an automated propaganda machine.
This is especially true because mental health care and healthcare in this country have been failing for decades, and even people who “don’t have mental health problems” aren’t magically mentally healthy; they just don’t know the status of their mental health. A whole lot of people, in the US especially, are mentally ill or facing neurological medical problems they don’t know about.
Sounds to me like it’s mostly luck whether you fall into that hole or not, or that a lot of people would rather believe in something even though they know it isn’t true or the chance is extremely low, like playing the lottery.
I’ve never met people IRL who see LLMs as more than a digital tool that can be wrong (at least not to my knowledge), so it’s hard for me to understand, because I haven’t been able to ask anyone. I understand it can be nice to be heard, but to me an LLM is very hollow: there is no experience behind its answers, and you can tell it doesn’t care or try to understand (which is also why I don’t understand the attachment). I actually get more frustrated than happy when it says empty stuff like “you’ve got good instincts!”, doesn’t challenge my decisions or statements at all (even when I ask it to), or gives me nothing when I ask for inspiration (its creativity is extremely lacking). I feel the same about people if I think they aren’t trying to understand and just give me empty replies, like a salesperson reading from a script.
So that’s mostly why it’s hard for me to understand, even though I know mental health and loneliness are a big part of it. I still don’t understand why people can feel attached to LLMs and go so far with them. Echo chambers with actual people are a lot more understandable; that makes sense to me. LLMs do not.