• anomnom@sh.itjust.works
    13 hours ago

    It’s not AI. It’s LLMs, which don’t actually think in any meaningful way. They just repeat what they have ingested and whatever was most mathematically likely.

    That’s why I’m a pessimist about LLMs doing anything truly revolutionary. They’re just another productivity tool for solving problems that shouldn’t exist in the first place, and middle-management loves them for the same fucking reason.

    • JackbyDev@programming.dev
      8 hours ago

      What you’re calling AI has largely shifted to being called AGI. Either way, the ship has long since sailed, and LLMs are lumped under the category of AI. That’s what they’re called; usage dictates meaning. It’s not an endorsement of the technology, the same way computer opponents in games are called AI even if they aren’t “real AI.”

    • mojofrododojo@lemmy.world
      11 hours ago

      Yup. A roided-up ELIZA isn’t going to synthesize anything new. They can do some tasks, but it’s most certainly not artificial intelligence. And chaining a bunch of ELIZAs together isn’t going to make them smarter (claw etc.), much less make them reliable and useful.