A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

  • minorkeys@lemmy.world · 1 month ago
    So it’s somewhere they feel safe to do so. Says something pretty fucked up about our culture that men don’t feel safe to open up anywhere. And no, it’s not their own fault.

  • kingthrillgore@lemmy.ml · 1 month ago
    Of course men will go to an AI for their problems, they can’t fathom going to a woman for honest advice.

    And as a result, they gaslight themselves with a worse version of ELIZA.

    • NostraDavid@programming.dev · 18 days ago
      > they can’t fathom going to a woman for honest advice.

      Honest advice may not be good advice. I could tell a person “go kill yourself”, and be VERY honest about it. Yet it’s not good advice, now is it?

  • chunes@lemmy.world · 1 month ago
    And it’s awesome. Men aren’t allowed by others to show weakness. AI therapy genuinely helps a lot.

    • prof@infosec.pub · 1 month ago
      Or it gets them into a negative feedback loop since AI hardly ever tries to contradict you.

      But yeah. At least they’re opening up to someone/something.