Google’s AI Tells Users To Eat Rocks for Digestion, Add Glue to Their Pizza Amid Other Bizarre Responses

Netizens are confused by the bizarre results Google returns when they search for particular phrases like “depression” or “I’m depressed.”

Reportedly, Google’s updated search engine, with its new “AI Overviews” feature, has told users to clean their washing machines with chlorine gas, add glue to their pizza, and eat rocks.

Google’s AI Overview Gives Bizarre Answers to Users’ Queries

According to various news and social media reports, when one user searched “I’m feeling depressed,” the AI tool offered an extraordinary “solution”: jumping off the Golden Gate Bridge.

The “AI Overviews” feature was introduced on an experimental basis to summarize search results using the Gemini AI model. Ahead of a global release, it has been rolled out to some users in the U.S.

Google announced the feature on May 14 at its I/O developer conference, but the tool has already created a commotion across social media platforms.

Users claimed that the tool pulls its answers from The Onion, the satirical website, and from comedic Reddit posts.

The AI’s reply to a query about pizza said, “You can also add about ⅛ cup of non-toxic glue to the sauce to give it more tackiness.” Tracing the answer reveals that it is based on a decade-old joke comment made on Reddit.

Other users took to X, formerly Twitter, and posted screenshots of other erroneous claims made by Google’s AI search results.

Bizarre Responses by Google

The AI claimed that people should eat a rock a day for better digestion and that a dog has played in the NBA, NHL, and NFL. Other claims include that Founding Father John Adams graduated from the University of Wisconsin 21 times and, above all, that Barack Obama is a Muslim.


Google’s bizarre responses are far from the first time an AI model has been caught mixing things up or inventing new “facts.” The phenomenon is popularly known as “hallucination.”

Live Science reported one notable example in which ChatGPT “fabricated a sexual harassment scandal and named a real law professor as the perpetrator, citing fictitious newspaper reports as evidence.”
