London Underground Is Testing Real-Time AI Surveillance Tools to Spot Crime
In a test at one station, Transport for London used a computer vision system to try to detect crime and weapons, people falling on the tracks, and fare dodgers, documents obtained by WIRED show.
Thousands of people using the London Underground had their movements, behavior, and body language watched by AI surveillance software designed to see if they were committing crimes or were in unsafe situations, new documents obtained by WIRED reveal. In the trial at Willesden Green—a station that had 25,000 visitors per day before the Covid-19 pandemic—the AI system was set up to detect potential safety incidents so staff could help people in need, but it also partly focused on criminal and antisocial behavior.

The categories the system tried to identify were: crowd movement, unauthorized access, safeguarding, mobility assistance, crime and antisocial behavior, person on the tracks, injured or unwell people, hazards such as litter or wet floors, unattended items, stranded customers, and fare evasion.