Artificial intelligence may one day help UK police forces predict crime before it happens. But unlike in the movie Minority Report, the police plan friendlier countermeasures than a SWAT team dropping down from a hovercraft.
The system under development has been dubbed the National Data Analytics Solution (NDAS). It will combine AI and statistics to review individuals' data and assess their risk of becoming a victim of gun or knife crime. It will also gauge the likelihood of someone falling into modern slavery, as exclusively reported by New Scientist.
The project aims to roll out a prototype system by March 2019, with West Midlands Police taking the lead. London’s Metropolitan Police, Greater Manchester Police and six other forces are also involved in the system’s development.
Once active, the system will comb through the files of individuals already known to police officers and rank them by assessed risk. The highest-risk individuals will then be nominated for interventions and pre-emptive counseling.
Science fiction fans can rest easy for now: armed, helmeted officers crashing through windows is not the UK police’s idea of an intervention.
Project leader Iain Donnelly said they have already gathered more than a terabyte of data from both local and national police databases. These records cover around 5 million individuals who have been stopped and searched, along with logs of any crimes they have committed. Past cases will help police plot out a trajectory, so they can keep an eye on people who may be prone to violence and other offenses.
While the aims are well-intentioned, the Alan Turing Institute in London has pointed out “serious ethical issues” with the project. Andrew Ferguson, of the University of the District of Columbia, added that the system may reinforce existing biases and disproportionately affect people of color and residents of poor neighborhoods.
Martin Innes, director of the Crime and Security Research Institute at Cardiff University, UK, also voiced skepticism that the system can reliably predict offenses at the individual level. Instead, he suggested NDAS may be better suited to evaluating communities.
All these concerns will either be dispelled or confirmed once Innes and his colleagues evaluate the NDAS system in greater detail.