"42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.
Don't use AI as your doctor, says European research on Microsoft Copilot.
Google has been mocked for its odd and error-laden AI search results over the past year, with the initial rollout recommending that users eat rocks or add glue to pizza. (Image credit: Kevin Okemwa | Windows Central)

The research paper details how Microsoft Copilot specifically was asked to answer the 10 most popular medical questions in America about 50 of the most prescribed drugs and medicines. Whether the topic is potentially dangerous medical advice, conspiracy theories, political misinformation, or anything in between, there's a non-trivial chance that Microsoft's AI summaries could cause serious harm at some point if the company isn't careful.