AI Case Study
London Metropolitan Police plan to automatically detect illegal and inappropriate material on confiscated devices
The London Metropolitan Police plan to implement AI software that can identify inappropriate and illegal content, such as child abuse imagery, stored on confiscated devices. Their current software identifies guns and drugs, but not this category.
Industry
Public And Social Sector
Security
Project Overview
According to The Telegraph, currently the "digital forensics team uses bespoke software that can identify drugs, guns and money while scanning someone's computer or phone. But it has proven problematic when searching for nudity." However, "Artificial intelligence will take on the gruelling task of scanning for images of child abuse on suspects' phones and computers so that police officers are no longer subjected to psychological trauma within 'two to three years'."
Reported Results
Planned; results not yet available
Technology
Function
Legal And Compliance
Legal
Background
"The Metropolitan Police's digital forensics department, which last year trawled through 53,000 different devices for incriminating evidence, already uses image recognition software but it is not sophisticated enough to spot indecent images and video, Mark Stokes, the Met's head of digital and electronics forensics, told the Telegraph. 'We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans,' he said. Handing this work over to computers could spare forensics specialists, who spend their careers trawling through such pictures, from psychological strain."
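The Met's actual tooling is not public, but the triage described above (machine grading with a human fallback) can be sketched in outline. The sketch below assumes a hypothetical classifier that returns a confidence score per image; the function names, thresholds, and the stand-in classifier are all illustrative, not the Met's system.

```python
# Hypothetical triage sketch: route images by classifier confidence.
# The classifier itself is assumed -- here a stand-in callable that
# returns a probability that an image contains illegal material.

from typing import Callable, Dict, List, Tuple


def triage(
    images: List[str],
    classify: Callable[[str], float],
    review_band: Tuple[float, float] = (0.4, 0.8),
) -> Dict[str, List[str]]:
    """Sort image paths into three queues by model confidence.

    Scores below the band are cleared automatically, scores above it
    are flagged as likely illegal, and ambiguous scores in between go
    to a human reviewer -- the step such a system aims to minimise.
    """
    queues: Dict[str, List[str]] = {"cleared": [], "human_review": [], "flagged": []}
    low, high = review_band
    for path in images:
        score = classify(path)
        if score < low:
            queues["cleared"].append(path)
        elif score >= high:
            queues["flagged"].append(path)
        else:
            queues["human_review"].append(path)
    return queues


# Stand-in classifier for demonstration: scores come from a lookup table
# (a real system would run an image-recognition model on the file).
demo_scores = {"a.jpg": 0.1, "b.jpg": 0.6, "c.jpg": 0.95}
result = triage(list(demo_scores), demo_scores.get)
```

The width of `review_band` controls the trade-off the article describes: a narrower band sends fewer images to human reviewers but raises the stakes of automated misclassification at the boundaries.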
Benefits
Data