British Police Develop 'Psycho-Pass' AI to Predict Crimes Before They Happen
posted by Kim Morrissy
In a move straight out of the script of the dystopian sci-fi anime series Psycho-Pass, police in the UK have been developing an artificial intelligence that will predict how likely a person is to commit or be the victim of a violent crime. New Scientist reported on November 26 that the police intend to flag individuals with the system and intervene before crimes happen by offering preemptive counseling.
The system, called the National Data Analytics Solution (NDAS), uses a combination of AI and statistics from local and national police databases. Ian Donnelly, the police lead on the project, said that they have collected over a terabyte of data from the early phases of the project, including logs of committed crimes and about 5 million identifiable people.
The software found nearly 1,400 indicators in the data that could help predict crime. Among them, the number of crimes committed by people in an individual's social group was a strong factor in that individual's likelihood of committing a crime. Donnelly claims that the intent behind flagging individuals with a high risk indicator is not to arrest them before they commit crimes, but rather to provide them with support from local health or social workers.
The project began as a means of channeling limited police resources more effectively. Donnelly said that because police funding has been slashed in recent years, there is a need for a system that can help the police prioritize those who require intervention most urgently. The project has until the end of March 2019 to produce a prototype, with the ultimate hope that every police force in the UK could eventually use it.
Although the police will work with the UK's data watchdog, the Information Commissioner's Office, to ensure that the NDAS meets privacy regulations, the project has already drawn criticism from a team of scientists at the Alan Turing Institute in London, who cite "serious ethical issues" with the foundation of the project and question whether it is in the public good for the police to intervene with individuals who have not yet committed crimes.
Other researchers have highlighted concerns that the system will reinforce pre-existing biases against poor communities and people of color. For example, commercial face recognition software has repeatedly been shown to be less accurate on people with darker skin. Another issue is that the system could concentrate resources in areas the police already have extensive data on. Andrew Ferguson at the University of the District of Columbia said that arrests correlate with where police are deployed and are not representative of crime numbers overall. This tends to disproportionately affect groups that are already marginalized.
Around the world, police are increasingly using data to predict crime before it happens. PredPol, developed at Santa Clara University in California, is software that identifies future crime hot spots. Earlier this year, Human Rights Watch criticized the Chinese authorities for preemptively detaining people in the province of Xinjiang using predictive policing. The future predicted by Psycho-Pass may not be so far off.