BBC finds significant inaccuracies in over 30% of AI-produced news summaries. Frequent problems include mangled quotes, editorializing, and outdated info.

The BBC journalists who reviewed the responses were asked to look for issues (either "significant" or merely "some") regarding accuracy, impartiality and editorialization, attribution, clarity, context, and fair representation of the source BBC article. In one cited summary, for instance, ChatGPT refers to Ismail Haniyeh as part of Hamas leadership despite his widely reported death last July. In any case, the frequency and severity of the significant problems cited in the BBC report are enough to suggest, once again, that you can't simply rely on LLMs to deliver accurate information.

Related news:

AI Summaries Turn Real News Into Nonsense, BBC Finds

AI chatbots unable to accurately summarise news, BBC finds