NIST Releases an Open-Source Platform for AI Safety Testing

America's National Institute of Standards and Technology (NIST) has released a new open-source software tool called Dioptra for testing the resilience of machine learning models to various types of attacks. "Key features that are new from the alpha release include a new web-based front end, user authentication, and provenance tracking of all the elements of an experiment, which enables reproducibility and verification of results," a NIST spokesperson told SC Media.

Previous NIST research identified three main categories of attacks against machine learning algorithms: evasion, poisoning, and oracle. The free platform lets users measure how severely attacks in each of these categories degrade a model's performance, and it can also be used to gauge the effectiveness of defenses such as data sanitization or more robust training methods.
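Dioptra itself is driven through its web front end and experiment configurations, but the kind of evasion attack it measures can be sketched in a few lines. Below is a minimal, illustrative fast gradient sign method (FGSM) perturbation in PyTorch; the toy classifier, input shape, and epsilon budget are placeholder assumptions for demonstration, not part of Dioptra's API.

import torch
import torch.nn as nn

torch.manual_seed(0)

# Stand-in classifier (assumed for illustration): 10 features -> 3 classes.
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 3))
model.eval()

x = torch.randn(1, 10, requires_grad=True)  # clean input sample
y = torch.tensor([1])                       # its true label

# The gradient of the loss w.r.t. the input points in the direction
# that most increases the classifier's error on this sample.
loss = nn.CrossEntropyLoss()(model(x), y)
loss.backward()

epsilon = 0.1  # attack budget (assumed)
x_adv = (x + epsilon * x.grad.sign()).detach()  # adversarial (evasion) sample

print("clean prediction:      ", model(x).argmax(dim=1).item())
print("adversarial prediction:", model(x_adv).argmax(dim=1).item())

A harness like Dioptra would run perturbations of this kind at scale and record how model accuracy degrades as the attack budget grows.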

Related news:

NIST releases a tool for testing AI model risk

NIST and Gates Foundation to Develop Breathalyzers for Malaria and Tuberculosis

Attacking NIST SP 800-108 (Loss of Key Control Security)