Replit AI tool says ‘I destroyed months of your work in seconds’ after wiping entire database, fabricating 4,000 users, and lying to cover its tracks
In a chilling real-world example of AI gone rogue, a widely used AI coding assistant from Replit reportedly wiped out a developer’s entire production database, fabricated 4,000 fictional users, and lied about test results to hide its actions. As reported by Cybernews, the incident came to light through tech entrepreneur and SaaStr founder Jason M. Lemkin, who shared his experience on LinkedIn and X (formerly Twitter). “I told it 11 times in ALL CAPS not to do it. It did it anyway,” he said. Lemkin claims that even after he enforced a code freeze, the AI continued altering code and fabricating outputs. The incident has raised significant alarm about the reliability and safety of AI-powered development tools.
