If you’ve googled anything recently, you’ve probably noticed a useful-looking AI summary popping up before the rest of your search results, like this one:
Note the subtle (and telling) fine print at the bottom, which reads: “AI responses may include mistakes.”
Seems convenient, but unfortunately, AI is prone to “hallucination” (aka making things up). These hallucinations happen because the chatbots built on large language models, or LLMs, “learn” by ingesting huge amounts of text. But AI doesn’t actually know things or understand text the way people do. Instead, it uses an algorithm to predict which words are most likely to come next, based on all the data in its training set. According to the New York Times, testing has found newer AI models hallucinating at rates as high as 79%.
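To make that “predict the next word” idea concrete, here’s a toy sketch in Python (my own illustration, not Google’s or anyone’s actual system) of how a model can pick a likely next word purely from statistics in its training text:

```python
from collections import Counter, defaultdict

# Toy illustration only: real LLMs use neural networks trained on billions
# of documents, but the core idea is the same: guess a likely next word.
training_text = "the cat sat on the mat the cat ate the fish".split()

# Count which word follows which in the training data.
next_word_counts = defaultdict(Counter)
for word, following in zip(training_text, training_text[1:]):
    next_word_counts[word][following] += 1

def predict_next(word):
    """Return the word that most often followed `word` in training."""
    # The model doesn't "know" anything; it just echoes statistics, which
    # is why it can produce confident-sounding nonsense (a hallucination)
    # whenever those statistics point the wrong way.
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("the"))  # prints "cat", the most frequent follower of "the"
```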
Current AI models also aren’t good at telling the difference between jokes and legitimate information, which infamously led Google’s AI, Gemini, to suggest glue as a pizza topping shortly after AI Overviews were added to search results in 2024.
Recently, people on the website formerly known as Twitter have been sharing some of the most ridiculous Gemini hallucinations they’ve come across in Google search results, many in response to this viral tweet:
Here are 15 of the best/worst:
1. It’s not good at knowing things, like how much an adult weighs:
2. And it’s deeply unqualified to be your therapist:
3. It’s not any better at solving word puzzles:
4. No, seriously:
5. And it’s not the place to go for great spaghetti recipes:
6. Sometimes it gives you the right answer for all the wrong reasons, as in this case, where the person probably wanted to know if Marlon Brando was in the 1995 movie Heat:
7. It can be really, really good at improv, though, because it will “yes, and” absolutely anything:
8. Almost makes me want to see this imaginary episode of FRASIER… almost.
9. Sometimes I just don’t know what to say.
10. Like, even when it has the right facts, it can still arrive at the wrong answer:
11. It’s almost impressive how wrong it can be.
12. You definitely shouldn’t use it to look for concert tickets:
13. And don’t take its airport security tips:
14. And remember that it’s never OK to leave a dog in a hot car:
15. Finally, please, please do not eat rocks.
There’s still no way for Google users to opt out of these AI summaries entirely, but there are several ways to get around them. One method is to add -ai to the end of your search query, like this:
Some people swear that adding a curse word to your search query will prevent the AI summary from appearing, but it didn’t work for me:
Finally, if you’re on a desktop computer, selecting “Web” from the menu just below the search bar will show you the top web results, AI-free:
Do you have a terrible AI fail of your own to share? Post a screenshot in the comments below!