AI Case Study
London Metropolitan Police plan to automatically detect illegal and inappropriate material on confiscated devices
The London Metropolitan Police plan to implement AI software that can identify inappropriate and illegal content, such as child pornography, stored on confiscated devices. Their current software identifies guns and drugs.
Public And Social Sector
According to The Telegraph, currently the "digital forensics team uses bespoke software that can identify drugs, guns and money while scanning someone’s computer or phone. But it has proven problematic when searching for nudity." However, "artificial intelligence will take on the gruelling task of scanning for images of child abuse on suspects' phones and computers so that police officers are no longer subjected to psychological trauma within 'two to three years'".
Planned; results not yet available
Legal And Compliance
"The Metropolitan Police's digital forensics department, which last year trawled through 53,000 different devices for incriminating evidence, already uses image recognition software but it is not sophisticated enough to spot indecent images and video, Mark Stokes, the Met's head of digital and electronics forensics, told the Telegraph. 'We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans,' he said. Handing this work over to computers could save forensics specialists, who spend their careers trawling through pictures, from psychological strain."