Summary: Researchers have identified a new supply chain attack vector dubbed “Slopsquatting.” Attackers are registering npm and PyPI packages under names that previously did not exist but are frequently “hallucinated” by AI coding assistants like Copilot and ChatGPT. Developers trusting these AI suggestions are inadvertently installing malware-laden dependencies that attackers registered just days prior.
Business Impact: This completely bypasses traditional “Typosquatting” defenses. Your developers aren’t making typos; they are copying “valid-looking” suggestions from trusted AI tools. This introduces a stealthy injection point for infostealers directly into your internal applications.
Why It Happened: LLMs are probabilistic, not factual. They often “invent” logical-sounding package names (e.g., `aws-s3-helper-v2`) to solve a prompt. Attackers are simply preemptively registering these hallucinated names to catch the traffic.
Recommended Executive Action: Update your “AI Coding Policy” immediately. Mandate that developers verify the age and download count of any AI-suggested library before installation. Configure your artifact manager (Artifactory/Nexus) to block packages younger than 30 days.
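The age check above can be automated before it ever reaches your artifact manager. A minimal sketch of the policy logic is below; the 30-day threshold mirrors the recommendation, and the `is_suspiciously_new` helper name is illustrative, not a standard tool. In practice the upload timestamps would come from a registry metadata endpoint such as PyPI’s JSON API (assumption: your tooling can reach it).

```python
from datetime import datetime, timedelta, timezone

MIN_AGE_DAYS = 30  # matches a "block packages younger than 30 days" policy


def oldest_upload(release_times):
    """Return the earliest upload timestamp from ISO-8601 strings."""
    return min(
        datetime.fromisoformat(t.replace("Z", "+00:00")) for t in release_times
    )


def is_suspiciously_new(release_times, now=None, min_age_days=MIN_AGE_DAYS):
    """True if the package's first upload is younger than the policy threshold."""
    now = now or datetime.now(timezone.utc)
    return now - oldest_upload(release_times) < timedelta(days=min_age_days)
```

A CI pre-install hook could feed this function the `upload_time_iso_8601` values from `https://pypi.org/pypi/<package>/json` and fail the build on a hit; the equivalent enforcement in Artifactory or Nexus would be a repository-level age filter rather than custom code.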
Hashtags: #Slopsquatting #AISecurity #SupplyChain #DevSecOps #HallucinationAttack #AppSec
