ChatGPT goes ‘off the rails’ with gibberish answers
The startup’s signature product, ChatGPT, began returning gibberish, nonsensical outputs to users yesterday afternoon, Tuesday, February 20, 2024, and many took to X (formerly Twitter) to complain. Some of ChatGPT’s outputs mixed Spanish and English unintelligibly, while others made up words or repeated phrases over and over, despite the large language model (LLM)-powered chatbot not being asked to do so. One astute user compared the seemingly random strings of disconnected words to the unsettling “weird horror” extraterrestrial graffiti from Jeff VanderMeer’s seminal 2014 novel Annihilation, and to this reader of both, they do bear a similarly eerie quality: an inhuman intelligence that appears out of whack or illogical.
Or read this on VentureBeat.