Google's "AI Overview" can give false, misleading, and dangerous answers
From glue-on-pizza recipes to recommending "blinker fluid," Google's AI sourcing needs work.
But the potential damage that AI inaccuracy can cause gets multiplied when those errors appear atop the ultra-valuable web real estate of the Google search results page. A response recommending "blinker fluid" for a turn signal that doesn't make noise can similarly be traced back to a troll on the Good Sam advice forums, which Google's AI Overview apparently trusts as a reliable source. When asked how many Declaration of Independence signers owned slaves, for instance, Google's AI Overview accurately summarizes a Washington University in St. Louis library page saying that one-third "were personally enslavers."