A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.
“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.
Of course men will go to an AI for their problems; they can’t fathom going to a woman for honest advice.
And as a result, they gaslight themselves with a worse version of ELIZA.
Honest advice may not be good advice. I could tell a person “go kill yourself” and be VERY honest about it. Yet it’s not good advice, now is it?
If I had nothing else going on I’d probably do it