NIST Releases an Open-Source Platform for AI Safety Testing
America's National Institute of Standards and Technology (NIST) has released a new open-source software tool called Dioptra for testing the resilience of machine learning models to various types of attacks. "Key features that are new from the alpha release include a new web-based front end, user authentication, and provenance tracking of all the elements of an experiment, which enables reproducibility and verification of results," a NIST spokesperson told SC Media. Previous NIST research identified three main categories of attacks against machine learning algorithms: evasion, poisoning, and oracle. The free platform lets users measure how severely attacks in each of these categories degrade model performance, and it can also be used to gauge the effectiveness of defenses such as data sanitization or more robust training methods.
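To make the first of those categories concrete: an evasion attack perturbs an input at inference time so that a trained model misclassifies it. A minimal sketch of one well-known technique of this kind (an FGSM-style sign-of-gradient perturbation) is below; the toy linear classifier, its weights, and the `fgsm` helper are illustrative assumptions for this sketch, not part of Dioptra's API.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy logistic classifier; w and b stand in for pretrained weights
# (assumed values, purely for illustration).
w = [2.0, -1.5]
b = 0.1

def predict(x):
    """Probability that input x belongs to class 1."""
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def input_gradient(x, y):
    """Gradient of the cross-entropy loss w.r.t. the input x.

    For a logistic model, d(loss)/dx = (p - y) * w.
    """
    p = predict(x)
    return [(p - y) * wi for wi in w]

def sign(v):
    return (v > 0) - (v < 0)

def fgsm(x, y, epsilon=0.5):
    """Nudge x in the direction that increases the loss,
    scaled by the attack budget epsilon."""
    g = input_gradient(x, y)
    return [xi + epsilon * sign(gi) for xi, gi in zip(x, g)]

x = [0.4, -0.2]            # clean input, true label 1
p_clean = predict(x)       # confidently class 1
x_adv = fgsm(x, y=1)       # adversarially perturbed input
p_adv = predict(x_adv)     # the model's confidence drops below 0.5
```

A platform like Dioptra runs attacks of this shape at scale against a registered model and reports how far accuracy falls as the perturbation budget grows.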