A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.
“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.
So it’s somewhere they feel safe to do so. Says something pretty fucked up about our culture that men don’t feel safe opening up anywhere. And no, it’s not their own fault.
Of course men will go to an AI for their problems, they can’t fathom going to a woman for honest advice.
And as a result, they gaslight themselves with a worse version of ELIZA.
“they can’t fathom going to a woman for honest advice.”
Honest advice may not be good advice. I could tell a person “go kill yourself”, and be VERY honest about it. Yet it’s not good advice, now is it?
If I had nothing else going on I’d probably do it.
And it’s awesome. Men aren’t allowed by others to show weakness. AI therapy genuinely helps a lot.
Or it traps them in a feedback loop, since AI hardly ever contradicts you.
But yeah. At least they’re opening up to someone/something.