Archived link: https://archive.ph/Vjl1M

Here’s a nice little distraction from your workday: Head to Google, type in any made-up phrase, add the word “meaning,” and search. Behold! Google’s AI Overviews will not only confirm that your gibberish is a real saying, it will also tell you what it means and how it was derived.

This is genuinely fun, and you can find lots of examples on social media. In the world of AI Overviews, “a loose dog won’t surf” is “a playful way of saying that something is not likely to happen or that something is not going to work out.” The invented phrase “wired is as wired does” is an idiom that means “someone’s behavior or characteristics are a direct result of their inherent nature or ‘wiring,’ much like a computer’s function is determined by its physical connections.”

It all sounds perfectly plausible, delivered with unwavering confidence. Google even provides reference links in some cases, giving the response an added sheen of authority. It’s also wrong, at least in the sense that the overview creates the impression that these are common phrases and not a bunch of random words thrown together. And while it’s silly that AI Overviews thinks “never throw a poodle at a pig” is a proverb with a biblical derivation, it’s also a tidy encapsulation of where generative AI still falls short.

  • Melvin_Ferd@lemmy.world · 2 hours ago

    I mean, are you asking it whether there's a history of the idiom existing, or just what the idiom could mean?

  • ParadoxSeahorse@lemmy.world · 14 hours ago

    The saying “you can’t cross over a duck’s river” is a play on words, suggesting that it’s difficult to cross a river that is already filled with ducks. It’s not a literal statement about rivers and ducks, but rather an idiom or idiom-like phrase used to express the idea that something is difficult or impossible to achieve due to the presence of obstacles or challenges.

    I used the word “origin” instead of “meaning”, which didn’t seem to work.

  • webadict@lemmy.world · 19 hours ago

    The saying “you can’t butter a fly” is an idiom expressing that someone or something is too difficult to influence or manipulate. It’s rooted in the idea that butterflies, with their delicate nature, are virtually impossible to convince to do anything against their will, let alone “butter” them in a literal sense.

    • Deebster@infosec.pub (OP) · 17 hours ago

      This is a great example; it kinda makes sense if you skim-read it, but butterflies have nothing to do with butter, just like hot dogs have nothing to do with dogs.

    • surewhynotlem@lemmy.world · 17 hours ago

      No, that phrase means “this situation is hopeless because the person is incapable of change”. You can’t turn a fly into a butterfly, no matter how hard you try.

  • atrielienz@lemmy.world · 18 hours ago

    I for one will not be putting any gibberish into Google’s AI for any reason. I don’t find it fun. I find it annoying, and I’ve deliberately taken steps to avoid it completely. I don’t understand these articles that want to throw shade at LLMs by telling their readers to go use the LLMs, which only helps the companies that own them.

    Like. Yes. We have established that LLMs will give misinformation and create slop because all their data sets are tainted. Do we need to keep furthering this nonsense?

  • exixx@lemmy.world · 1 day ago

    Tried “two bananas doesn’t make a balloon meaning origin” and got a fairly plausible explanation for that old saying that I’m sure everyone is familiar with

  • NOT_RICK@lemmy.world · 1 day ago

    I just tested it on Bing too, for shits and giggles

    you can’t butter the whole world’s bread meaning

    The phrase “you can’t butter the whole world’s bread” means that one cannot have everything

  • Nurse_Robot@lemmy.world · 1 day ago

    Didn’t work for me. A lot of these ‘gotcha’ AI moments seem to work for only a small percentage of users before being noticed and fixed. And that’s not counting the more frequent examples that are just outright lies but get upvoted anyway because ‘AI bad’.

    • Ironfacebuster@lemmy.world · 1 day ago

      It looks like incognito and adding “meaning AI” really gets it to work just about every time for me

      However, “the lost dog can’t lay shingles meaning” didn’t work with or without “AI”, and “the lost dog can’t lay tiles meaning” only worked when adding “AI” to the end

      So it’s a gamble on how gibberish you can make it, I guess.

    • Deebster@infosec.pub (OP) · 1 day ago

      I found that trying “some-nonsense-phrase meaning” won’t always trigger the idiom interpretation, but you can often change it to something more saying-like.

      I also found that trying in incognito mode gave better results, so perhaps it’s also affected by your settings. Maybe it’s regional as well, or based on your search results. And since AI is non-deterministic, you can’t expect it to work every time.
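
      The non-determinism comes from sampling: the model draws each next token from a probability distribution instead of always taking the single top pick, so the same prompt can give different answers. A toy Python sketch of the idea (the tokens and probabilities here are invented for illustration, not anything Google actually uses):

      ```python
      import random

      # Made-up probabilities for the next token after some prompt.
      next_token_probs = {"proverb": 0.40, "idiom": 0.35, "saying": 0.25}

      tokens, weights = zip(*next_token_probs.items())

      # Sampling instead of always taking the argmax: each run can pick
      # a different token, which is why the same nonsense query doesn't
      # always trigger the same overview.
      for _ in range(3):
          print(random.choices(tokens, weights=weights)[0])
      ```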

    • GraniteM@lemmy.world · 20 hours ago

      Now I’ll never know what people mean when they say “those cupcakes won’t fill a sauna”!

  • knexcar@lemmy.world · 16 hours ago

    Honestly, I’m kind of impressed it’s able to analyze seemingly random phrases like that. It means it’s thinking and not just regurgitating facts. Because such a phrase could actually exist someday, and the AI wouldn’t need to wait for it to become mainstream.

    • ilinamorato@lemmy.world · 5 hours ago

      It’s not thinking. It’s just spicy autocomplete; having ingested most of the web, it “knows” that what follows a question about the meaning of a phrase is usually the definition and etymology of that phrase; there aren’t many examples online of anyone asking for the definition of a phrase and being told “that doesn’t exist, it’s not a real thing.” So it does some frequency analysis (actually it’s probably more correct to say that it is frequency analysis) and decides what the most likely words to come after your question are, based on everything it’s been trained on.

      But it doesn’t actually know or think anything. It just keeps emitting the next expected word until it hits a stopping condition.
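
      Here’s a toy sketch of that “spicy autocomplete” in Python: a bigram model that counts which word follows which in a tiny corpus, then always emits the most frequent continuation. The corpus is invented for the example, and real LLMs use learned probabilities over far more context, but the principle is the same: predict the next token, never check whether the phrase actually exists.

      ```python
      from collections import Counter, defaultdict

      # Tiny invented "training corpus" of definition-style sentences.
      corpus = ("the phrase means that something is unlikely "
                "the phrase means that someone is stubborn "
                "the saying means that something is unlikely").split()

      # Count, for each word, which words follow it and how often.
      follows = defaultdict(Counter)
      for word, nxt in zip(corpus, corpus[1:]):
          follows[word][nxt] += 1

      # Autocomplete: always emit the most frequent next word.
      word, output = "phrase", ["phrase"]
      for _ in range(5):
          word = follows[word].most_common(1)[0][0]
          output.append(word)

      print(" ".join(output))  # phrase means that something is unlikely
      ```

      It happily produces a definition-shaped sentence; nowhere does it ask whether the phrase is real.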