Google defends AI search results after telling people to put glue on pizza

Last week, Google released its AI search results for millions of users to tinker with. The goal was to provide a better search experience. Instead, the AI returned all sorts of weird results, like telling people to put glue on pizza to help the cheese stick, or to eat rocks.

Now, in a company blog post released Thursday, Liz Reid, Google’s head of search, blames “data gaps” for the wrong results, as well as people generating odd queries, and doubles down by saying that AI results lead to “higher satisfaction” with search. Are AI Overviews typically “hallucinating”? Reid argues no; they sometimes just misinterpret what is already on the internet.

“There’s nothing like millions of people using a feature with many novel searches,” she writes. “We’ve also seen nonsensical new searches seemingly aimed at producing erroneous results.” She also rightly notes that there are “a large number of faked screenshots” of AI Overviews circulating online.

First, I would like to point out that “Which mammal has the most bones?” is a fair question if you spend any time with an inquisitive toddler. Second, Google opted millions of people into this feature, which caused a lot of backlash and prompted articles explaining how to disable it.

Reid’s blog post also explains how Google adjusts AI Overviews when it comes to “nonsense” queries and satire. It is good to address this, because many people were surprised that content from the well-known satirical website The Onion and from Reddit user “fucksmith” wasn’t filtered out of the AI results in the first place.

Part of Reid’s blog post compares AI Overviews to another long-standing search feature called featured snippets, which highlight information from a relevant web page without using AI. According to Reid, the “accuracy rate” for featured snippets is “on par” with AI Overviews.


If Google is going to compete, it needs to move fast. But it also needs to maintain user trust, and that trust can be difficult to win back after the AI Overviews tell people to eat Elmer’s glue.
