Colin Clark’s Post

CISSP CISA CDPSE PCIP. Former PCI-QSA P2PE-QSA QPA with global experience in payments and information security

So LLMs hallucinate. Fact. People are using LLMs to write code. Fact. The code includes packages that do not exist… and it does so repeatedly.

So if you generate enough code using an LLM, you will find the most common non-existent packages it includes. Then you go and create a malicious package using the hallucinated name. There you go: loads of companies including your malicious software in their software. They do all the work for you!

Unlikely, you say? Theoretically possible but would never happen, you say? Wrong. It has happened, and Alibaba is one of the companies that was impacted.

3 suggestions:

1) Create an SBOM and validate all packages, libraries, code, and APIs used in your product, so you know what you need to test and can protect yourself from supply chain attacks (see the sketch below).

2) Don’t use LLMs to write code without checking it thoroughly (which defeats the object of using LLMs in the first place, hence (3)).

3) Don’t use LLMs in any business-critical function.

This will only get worse. LLMs are producing errors which are being used to train the next generation of LLMs, so the problem is self-propagating - a true self-eating worm. https://lnkd.in/e_XKwGhA
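On suggestion (1): a minimal sketch of that kind of dependency check for a Python project, assuming a plain requirements.txt. The PyPI JSON endpoint queried here is real; the file name and the idea of gating a build on the exit code are illustrative assumptions, not any specific tool's behavior.

```python
# Sketch: flag dependencies in requirements.txt that do not exist on PyPI,
# i.e. names an LLM may have hallucinated (or that an attacker could register).
import re
import sys
import urllib.error
import urllib.request

PYPI_URL = "https://pypi.org/pypi/{name}/json"  # real PyPI metadata endpoint

def package_exists(name: str) -> bool:
    """Return True if PyPI has a project under this name."""
    try:
        with urllib.request.urlopen(PYPI_URL.format(name=name), timeout=10):
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:          # unknown project: possible hallucination
            return False
        raise                        # any other HTTP error: fail loudly

def main(path: str = "requirements.txt") -> int:
    missing = []
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Keep only the bare distribution name, dropping pins and extras.
            name = re.split(r"[=<>!~\[; ]", line, maxsplit=1)[0].strip()
            if name and not package_exists(name):
                missing.append(name)
    for name in missing:
        print(f"NOT ON PYPI (possible hallucinated dependency): {name}")
    return 1 if missing else 0       # non-zero exit so CI can block the build

if __name__ == "__main__":
    sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "requirements.txt"))
```

Note this only proves a name exists on the registry; it does not prove the package is the one you meant, so it complements rather than replaces SBOM validation and review.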

AI bots hallucinate software packages and devs download them

theregister.com
