AI Hallucinations Are Fueling a New Class of Supply Chain Attacks
Slopsquatting is a new supply chain threat where AI-assisted code generators recommend hallucinated packages that attackers register and weaponize.
Slopsquatting is a new term for a surprisingly effective type of software supply chain attack that emerges when LLMs "hallucinate" package names that don't actually exist. Researchers tested 16 leading code-generation models, both commercial (such as GPT-4 and GPT-3.5) and open source (such as CodeLlama, DeepSeek, WizardCoder, and Mistral), generating a total of 576,000 Python and JavaScript code samples. One of the more encouraging findings was that some models, particularly GPT-4 Turbo and DeepSeek, could correctly identify hallucinated package names they had just generated, achieving over 75% accuracy in internal detection tests.
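As a minimal sketch of one defense against this pattern, a build step could screen requested dependency names against a vetted allowlist (for example, an internal mirror of approved packages) and flag anything unrecognized for manual review before installation. The allowlist and package names below are illustrative, not drawn from the article.

```python
# Sketch: screen dependency names against a vetted allowlist before install.
# KNOWN_GOOD would come from an internal registry mirror in practice;
# the names here are hypothetical examples.
KNOWN_GOOD = {"requests", "numpy", "flask"}

def screen_dependencies(requested):
    """Split requested package names into vetted and suspect lists."""
    vetted = [pkg for pkg in requested if pkg in KNOWN_GOOD]
    suspect = [pkg for pkg in requested if pkg not in KNOWN_GOOD]
    return vetted, suspect

# A hallucinated name like "flask-authify" would be flagged for review
# instead of being installed blindly.
vetted, suspect = screen_dependencies(["requests", "flask-authify"])
print(vetted, suspect)
```

This does not verify that an allowlisted name is benign, but it blocks the specific slopsquatting path: an attacker-registered package whose name only exists because a model invented it.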