How to detect poisoned data in machine learning datasets


A proactive, coordinated effort can protect algorithmic integrity and reduce the chances that manipulations will impact model performance.

With careful, proactive detection efforts, organizations can save the weeks, months or even years of work they would otherwise spend undoing the damage caused by poisoned data sources. The good news is that organizations can take several measures to secure training data, verify dataset integrity and monitor for anomalies, minimizing the chances of poisoning. Although ML dataset poisoning can be difficult to detect, a proactive, coordinated effort significantly reduces the chances that manipulations will impact model performance.
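
To make two of those measures concrete, the Python sketch below shows what dataset integrity verification and a first-pass anomaly check might look like in practice. The file names, hash manifest format, column name and z-score threshold are illustrative assumptions, not techniques prescribed by the article.

```python
# Minimal sketch: verify a dataset file against a trusted hash manifest,
# then flag statistical outliers that may indicate tampering.
# File paths, manifest layout and the z-score threshold are assumptions
# made for illustration.

import csv
import hashlib
import json
import statistics
from pathlib import Path


def verify_integrity(data_path: Path, manifest_path: Path) -> bool:
    """Compare the dataset's SHA-256 hash with the value recorded
    when the data was originally collected and reviewed."""
    expected = json.loads(manifest_path.read_text())[data_path.name]
    actual = hashlib.sha256(data_path.read_bytes()).hexdigest()
    return actual == expected


def flag_anomalies(data_path: Path, column: str, z_threshold: float = 4.0):
    """Return row indices whose value in `column` deviates strongly
    from the rest of the dataset -- a cheap first-pass poisoning check."""
    with data_path.open() as f:
        values = [float(row[column]) for row in csv.DictReader(f)]
    mean, stdev = statistics.mean(values), statistics.stdev(values)
    return [i for i, v in enumerate(values)
            if stdev and abs(v - mean) / stdev > z_threshold]


if __name__ == "__main__":
    dataset = Path("training_data.csv")      # assumed file name
    manifest = Path("trusted_hashes.json")   # assumed manifest of known-good hashes

    if not verify_integrity(dataset, manifest):
        raise SystemExit("Dataset hash mismatch: possible tampering.")

    suspicious = flag_anomalies(dataset, column="label_confidence")
    if suspicious:
        print(f"Review {len(suspicious)} anomalous rows: {suspicious[:10]}")
```

Checks like these are deliberately lightweight; they are meant to run routinely in a data pipeline so that anomalies surface before training begins, not to replace deeper audits.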
