Man files complaint after ChatGPT said he killed his children


The Norwegian man says the chatbot's maker, OpenAI, should be fined over the completely inaccurate information.

Google's AI Gemini has also fallen foul of hallucinations: last year it suggested using glue to stick cheese to pizza, and said geologists recommend humans eat one rock per day.

Noyb told the BBC that Mr Holmen had made a number of searches that day, including putting his brother's name into the chatbot, which produced "multiple different stories that were all incorrect." They also acknowledged that the previous searches could have influenced the answer about his children, but said large language models are a "black box" and that OpenAI "doesn't reply to access requests, which makes it impossible to find out more about what exact data is in the system."


Related news:

ChatGPT reportedly accused innocent man of murdering his children

ChatGPT hit with privacy complaint over defamatory hallucinations

ChatGPT could be set as 'default' assistant on Android phones