
News

Met brings in artificial intelligence to stop mental scarring

Published: Tuesday 9 January 2018

Forensic departments may soon be able to avoid the grim task of trawling through images of child abuse on suspects’ phones and computers.

The Metropolitan Police's digital forensics department has ambitious plans to use artificial intelligence (AI) software that will automatically flag any images it deems inappropriate.

The forensics team, which last year sifted through 53,000 different devices for incriminating evidence, already uses image recognition software, but that software cannot accurately detect nudity.

Mark Stokes, the Met's head of digital and electronics forensics, says the psychological trauma caused by viewing hundreds of haunting images could be avoided in the future. He told The Telegraph: "We have to grade indecent images for different sentencing, and that has to be done by human beings right now, but machine learning takes that away from humans."

"You can imagine that doing that year-on-year is very disturbing."

The force is currently drawing up a pioneering plan to move its sensitive data to cloud providers such as Google or Microsoft.

The Met currently uses a London-based data centre, but the sheer volume of images, along with the popularity of high-resolution video, is putting pressure on resources. This would be alleviated by harnessing the tech giants' considerably larger storage capacity.

Additionally, with the help of Silicon Valley providers, AI could be trained to detect abusive images "within two to three years".

The Met's digital forensics team uses bespoke software that can identify drugs, guns and money while scanning someone's computer or phone. But it has proven unreliable when searching for nudity.

On numerous occasions, the programme has flagged images of deserts as nudity because of sand's skin-like colour.
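To see why a desert photo can trip a colour-based filter, consider a minimal sketch of the kind of naive skin-tone heuristic such software might use. The rule and thresholds below are illustrative assumptions, not the Met's actual system: it counts pixels whose RGB values fall in a rough "skin" range and flags the image if enough of them do, so sandy tans pass the same test as skin.

```python
def skin_like(r, g, b):
    """Very rough RGB skin-tone test (a common naive rule of thumb,
    assumed here for illustration)."""
    return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15

def flag_image(pixels, threshold=0.4):
    """Flag an image if a large fraction of its pixels look skin-coloured."""
    skin = sum(1 for (r, g, b) in pixels if skin_like(r, g, b))
    return skin / len(pixels) >= threshold

# A synthetic "desert" photo: sandy pixels share the warm hues of skin.
desert_sand = [(210, 180, 140)] * 1000   # tan / sand colour
blue_sky    = [(120, 170, 220)] * 400    # blue sky

print(flag_image(desert_sand + blue_sky))  # → True: the sand is "skin"
print(flag_image(blue_sky))                # → False: sky alone is not
```

Because the heuristic looks only at colour, not shape or context, any warm-toned scene can produce a false positive, which is the failure mode the article describes.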

Another challenge is moving the Met's data into the cloud at all, given the sensitive nature of the files the force stores.

Police staff are granted consent from the courts to store criminal images, but it is an offence for anyone else – including Microsoft or any other cloud provider – to store them. Providers could therefore be taking on considerable legal risk by holding this material.

The Met, in collaboration with the tech giants, is currently devising an IT plan that would cover the relevant terms and conditions.