The Hidden Risks of AI-Assisted Coding: Hallucinated Dependencies
It’s the end of a long workday, and you’re stuck on a stubborn error. The AI assistant suggests a package that sounds perfect. You install it and, unknowingly, open the door to a supply chain attack.
The increased use of AI-assisted coding introduces new vulnerabilities, and one of them is hallucinated code dependencies. A recent study published on arXiv, which analysed 576,000 generated Python and JavaScript code samples, found that nearly 20% of the recommended packages didn’t exist.
Hallucinated dependencies become a vulnerability when attackers exploit the package names AI models invent. The technique is called slopsquatting, and it takes advantage of the fact that the same hallucinated names tend to recur across similar prompts. An attacker can register those names in advance on public registries such as PyPI or npm, so when a developer installs a suggested package that never legitimately existed, they unknowingly pull down the attacker’s malicious package instead. A basic pre-install check is sketched below.
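To make the mechanism concrete, here is a minimal, illustrative Python sketch (not bifrost’s mechanism) that triages a suggested dependency against PyPI’s public JSON metadata endpoint. A name that returns 404 was likely hallucinated outright; a name that exists but was first uploaded only recently deserves suspicion as a possible slopsquat, since a plain existence check cannot catch a name an attacker has already registered. The `min_age_days` threshold of 90 is an arbitrary assumption for illustration.

```python
import json
import sys
import urllib.error
import urllib.request
from datetime import datetime, timezone

# Public PyPI metadata endpoint (returns 404 for unregistered names).
PYPI_URL = "https://pypi.org/pypi/{name}/json"

def vet_package(name: str, min_age_days: int = 90) -> str:
    """Rough triage of an AI-suggested dependency against PyPI metadata."""
    try:
        with urllib.request.urlopen(PYPI_URL.format(name=name), timeout=10) as resp:
            data = json.load(resp)
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return "MISSING: not on PyPI -- likely a hallucinated package name"
        raise

    # Earliest upload time across all released files.
    uploads = [
        datetime.fromisoformat(f["upload_time_iso_8601"].replace("Z", "+00:00"))
        for files in data["releases"].values()
        for f in files
    ]
    if not uploads:
        return "SUSPICIOUS: registered on PyPI but has no uploaded files"

    age_days = (datetime.now(timezone.utc) - min(uploads)).days
    if age_days < min_age_days:
        return f"SUSPICIOUS: first upload {age_days} days ago -- possible slopsquat"
    return f"OK: on PyPI for {age_days} days (still review maintainer and source)"

if __name__ == "__main__":
    for pkg in sys.argv[1:]:
        print(f"{pkg}: {vet_package(pkg)}")
```

Run it as `python vet_package.py some-suggested-package` before reaching for `pip install`; a heuristic like this is a cheap first filter, not a substitute for reviewing a package’s maintainer and source.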
This is where bifrost makes a difference. While you can’t prevent AI from hallucinating code, you can stop that hallucination from becoming an attack vector. By monitoring your application in real time, bifrost detects unexpected behaviour and actively blocks malicious packages before they can do any damage.
Are you using AI-assisted coding in your workflow? Let’s talk!
🔗 Read the full report here: https://arxiv.org/abs/2406.10279