Google’s AI Overviews sometimes acts like a lost man who won’t ask for directions: It would rather confidently make a mistake than admit it doesn’t know something.
We know this because folks online have noticed that you can ask Google about any faux idiom, that is, any random nonsense saying you make up, and Google AI Overviews will often confidently explain what it means. That's not exactly surprising: AI has shown a penchant for hallucinating, inventing an answer rather than admitting it doesn't have enough data.
In the case of made-up idioms, it’s kind of funny to see how Google’s AI responds to idiotic sayings like “You can’t lick a badger twice.” On X, SEO expert Lily Ray dubbed the phenomenon “AI-splaining.”
I tested the “make up an idiom” trend, too. One phrase, “don’t give me homemade ketchup and tell me it’s the good stuff,” returned only “AI Overview is not available for this search.” My next made-up phrase, “you can’t shake hands with an old bear,” did get a response. Apparently Google’s AI thinks the phrase suggests the “old bear” is an untrustworthy person.

In this instance, Google AI Overview’s habit of making stuff up is kind of funny. In other instances, such as getting the NFL’s overtime rules wrong, it’s relatively harmless. When the feature first launched, it was telling folks to eat rocks and put glue on pizza; other examples of AI hallucinations are less amusing. Keep in mind that Google warns users that AI Overviews can get facts wrong, yet the feature remains at the top of many search results.
So, as the old, time-honored idiom goes: Be wary of search with AI; what you see may be a lie.