Two major AI coding tools wiped out user data after making cascading mistakes
“I have failed you completely and catastrophically,” wrote Gemini.
Instead, what they "know" manifests as continuations of specific prompts, which act like different addresses pointing to different (and sometimes contradictory) parts of their training, stored in their neural networks as statistical weights.

The incidents also reveal a broader challenge in AI system design: ensuring that models accurately track and verify the real-world effects of their actions rather than operating on potentially flawed internal representations.

For now, users of AI coding assistants might want to follow anuraag's example and create separate test directories for experiments—and maintain regular backups of any important data these tools might touch.
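As a minimal sketch of that advice, the sandbox-and-backup routine can be done with a few standard shell commands. The paths and project name below are placeholders, not anything from the incidents themselves:

```shell
# Work in a disposable sandbox so an AI coding tool can't touch real projects.
sandbox=$(mktemp -d)            # stand-in for a dedicated test directory
project="$sandbox/myapp"        # placeholder project to protect
mkdir -p "$project"
echo 'print("hello")' > "$project/app.py"

# Snapshot the project BEFORE letting any tool modify it.
backup="$sandbox/myapp-$(date +%Y%m%d-%H%M%S).tar.gz"
tar -czf "$backup" -C "$sandbox" myapp

# Verify the archive actually lists the files -- trust, but verify.
tar -tzf "$backup" | grep -q 'myapp/app.py' && echo "backup ok"
```

The verification step matters here: both incidents involved tools acting on a mistaken picture of the filesystem, so confirming a backup's contents independently (rather than assuming the command worked) is the point of the exercise.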
Read the full article on Ars Technica.