LLM red teamers: People are hacking AI chatbots just for fun, and now researchers have catalogued 35 "jailbreak" techniques
Anthropic claims new AI security method blocks 95% of jailbreaks, invites red teamers to try