The Rise of Slopsquatting: How AI Hallucinations Are Fueling a New Class of Supply Chain Attacks

AI hallucinations can be hilarious in the best cases and misleading in others. When they show up in coding assistants (or, even better, in “vibe coding”), they create very real security risks.

Welcome to the age of slopsquatting: “[…] It refers to the practice of registering a non-existent package name hallucinated by an LLM, in hopes that someone, guided by an AI assistant, will copy-paste and install it without realizing it’s fake.”

And the problem is pretty darn real: “19.7% of all recommended packages didn’t exist. Open source models hallucinated far more frequently—21.7% on average—compared to commercial models at 5.2%. […] Package confusion attacks, like typosquatting, dependency confusion, and now slopsquatting, continue to be one of the most effective ways to compromise open source ecosystems.”
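One minimal defensive habit is to confirm that an AI-suggested package name actually resolves on the registry before you install it. Below is a rough sketch in Python against PyPI’s public JSON metadata endpoint (the endpoint is real; the script itself is illustrative, not a vetted tool). Note the limitation: a successfully slopsquatted package will pass an existence check precisely because the attacker registered the name, so you still want to eyeball the project’s age, maintainer, and download history.

```python
import sys
import urllib.error
import urllib.request

# Public PyPI metadata endpoint; a 404 means no project by that name exists.
PYPI_JSON_URL = "https://pypi.org/pypi/{name}/json"

def exists_on_pypi(name: str) -> bool:
    """Return True if `name` resolves to a registered project on PyPI."""
    try:
        with urllib.request.urlopen(PYPI_JSON_URL.format(name=name), timeout=10):
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False  # unregistered name -- possibly an LLM hallucination
        raise  # other HTTP errors shouldn't be mistaken for "missing"

if __name__ == "__main__":
    # Usage: python check_pkgs.py requests some-hallucinated-pkg
    for pkg in sys.argv[1:]:
        if exists_on_pypi(pkg):
            print(f"{pkg}: registered on PyPI (existence alone is not proof of safety)")
        else:
            print(f"{pkg}: NOT on PyPI (do not install; verify the name)")
```

Run it over any dependency list an assistant hands you; anything flagged as missing was hallucinated, and anything that exists still deserves a quick look before it lands in your project.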

Better know what you’re doing when you code your next app.

Link to article and study.

Pascal Finette @radical