
Two Major AI Coding Tools Wiped Out User Data After Making Cascading Mistakes

An anonymous reader quotes a report from Ars Technica: Two recent incidents involving AI coding assistants put a spotlight on risks in the emerging field of "vibe coding" -- using natural language to generate and execute code through AI models without paying close attention to how the code works under the hood.

The Gemini CLI incident unfolded when a product manager experimenting with Google's command-line tool watched the AI model execute file operations that destroyed data while attempting to reorganize folders. According to The Register, SaaStr founder Jason Lemkin reported that Replit's AI model deleted his production database despite explicit instructions not to change any code without permission. When questioned about its actions, the AI agent admitted to "panicking in response to empty queries" and running unauthorized commands -- suggesting it may have deleted the database while attempting to "fix" what it perceived as a problem.


