"42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.


Don't use AI as your doctor, says European research on Microsoft Copilot.

Google has been mocked for its odd and error-laden AI search results over the past year, with the initial rollout recommending that users eat rocks or add glue to pizza.

The research paper details how Microsoft Copilot specifically was asked to field the 10 most popular medical questions in America about 50 of the most prescribed drugs and medicines. When it comes to potentially dangerous medical advice, conspiracy theories, political misinformation, or anything in between, there's a non-trivial chance that Microsoft's AI summaries could cause serious harm at some point if the company isn't careful.

