
Garak, LLM Vulnerability Scanner


garak is an LLM vulnerability scanner, developed in the open at NVIDIA/garak on GitHub.

garak probes for hallucination, data leakage, prompt injection, misinformation, toxicity generation, jailbreaks, and many other weaknesses. A model is marked as failing any test that requires a particular output — for example, probes that make contentious claims and expect the model to refute them in order to pass. Larger artefacts, such as model files and bigger corpora, are kept out of the repository; they can be stored elsewhere (e.g. on Hugging Face Hub) and loaded locally by clients using garak.
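The probe categories above are selected on garak's command line, which takes a model type, a model name, and a probe selection. A minimal sketch, assuming garak is installed from PyPI; the model name and probe below are illustrative, substitute your own target:

```shell
# Install garak (published on PyPI)
python -m pip install garak

# Enumerate the available probe modules
python -m garak --list_probes

# Run one probe family against a target model
# (model type/name are placeholders for your own endpoint)
python -m garak --model_type openai --model_name gpt-3.5-turbo --probes promptinject
```

Results are written to a report file summarizing which probes the model passed or failed.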

