Open Source Coalition Announces 'Model-Signing' with Sigstore to Strengthen the ML Supply Chain
The advent of LLMs and machine learning-based applications "opened the door to a new wave of security threats," argues Google's security blog (including model and data poisoning, prompt injection, prompt leaking, and prompt evasion). So as part of the Linux Foundation's nonprofit Open Source Security Foundation...
By binding an OpenID Connect token to a workload or developer identity, Sigstore removes the need to manage or rotate long-lived signing secrets. "We can view model signing as establishing the foundation of trust in the ML ecosystem..." the post concludes, adding "We envision extending this approach to also include datasets and other ML-related artifacts." This has the potential to automate a significant fraction of the work needed to perform incident response in case of a compromise in the ML world...
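For a sense of what a signature over a model actually covers, below is a minimal, hypothetical Python sketch of the integrity layer underneath such a scheme: a manifest of per-file SHA-256 digests for a model directory that can be recomputed and compared after download. This is not the Sigstore project's API; in the keyless flow described above, Sigstore tooling would sign a manifest like this with a short-lived certificate tied to an OIDC identity and record it in a transparency log. The directory and file names here are placeholders.

```python
# Sketch only: build a manifest of SHA-256 digests for every file in a model
# directory, then re-check it later. In a real Sigstore-based flow, the signed
# artifact would be a manifest like this rather than a long-lived key pair.
import hashlib
import json
from pathlib import Path


def digest_manifest(model_dir: str) -> dict[str, str]:
    """Return {relative_path: sha256_hex} for every file under model_dir."""
    root = Path(model_dir)
    manifest = {}
    for path in sorted(p for p in root.rglob("*") if p.is_file()):
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        manifest[str(path.relative_to(root))] = h.hexdigest()
    return manifest


def verify(model_dir: str, manifest_path: str) -> bool:
    """Recompute digests and compare against a previously recorded manifest."""
    recorded = json.loads(Path(manifest_path).read_text())
    return digest_manifest(model_dir) == recorded


if __name__ == "__main__":
    # Hypothetical paths, for illustration only.
    manifest = digest_manifest("my-model")
    Path("my-model.manifest.json").write_text(json.dumps(manifest, indent=2))
    print("verified:", verify("my-model", "my-model.manifest.json"))
```

A per-file manifest matters because modern models ship as many artifacts (weight shards, tokenizer files, configuration), and tampering with any one of them should break verification.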