Man files complaint after ChatGPT said he killed his children
The Norwegian man says the chatbot's maker OpenAI should be fined over the completely inaccurate information.
Google's AI Gemini has also fallen foul of hallucinations: last year it suggested sticking cheese to pizza using glue, and said geologists recommend humans eat one rock per day.

Noyb told the BBC that Mr Holmen had made a number of searches that day, including putting his brother's name into the chatbot, which produced "multiple different stories that were all incorrect."

The group acknowledged that the previous searches could have influenced the answer about his children, but said large language models are a "black box" and OpenAI "doesn't reply to access requests, which makes it impossible to find out more about what exact data is in the system."