Are Crime Prediction Analytics Discriminatory or Life-Saving?

Critics of the program warn it will lead police to make arrests based on bias. Proponents say that the analysis and interpretation of data will help police effectively manage their scarce resources.

Haven Miller
Published in DataDrivenInvestor
Feb 15, 2019 · 4 min read

It sounds like something out of the dystopian movie Minority Report, but crime prediction analysis is already in use or in development at 14 police forces in the United Kingdom. Started as a project called the National Data Analytics Solution (NDAS) run by West Midlands Police, the analytics program is becoming more popular in the UK, to the horror and fascination of some citizens. The popular fashion magazine Dazed even said the program had “robocop vibes”. So how does this controversial predictive software work?

(Photo: Richard Baker/In Pictures/Getty)

NDAS evolved from Data Driven Insights, a project that the West Midlands Police and Crime Commissioner funded to the tune of £16 million. Iain Donnelly, a West Midlands Police superintendent and the NDAS project manager, described the goal of the project as seeking to “access multiple systems from a single screen [and find out] everything there is to know about a person or location”. Through the program, the West Midlands Police force has formed its own in-house data lab with a team of full-time data scientists to conduct the analysis and interpretation of the data. In total, the analysis draws on 1,400 indicators and 30 factors to evaluate an individual’s inclination to violent crime.

The data sets covered by the program include details of all relevant interactions with the criminal justice system — including information on crime recording, crime reporting, convictions, and custody data — as well as command and control, crime intelligence, and some internal police HR data.

Predictive data analysis is then used to identify the people considered to be at the highest risk of becoming perpetrators of knife and gun violence. The data scientists examined a group of offenders who had already committed gun or knife crimes and combed through their records to find a set of indicators that predicted their path toward violent crime. Equipped with that knowledge, police believe they can identify individuals on a similar track and intervene to offer support before they reach the point of gun or knife crime. Superintendent Donnelly contends that the program’s future success will lead to data being drawn from further sources, such as education, health, or social services, in order to make clearer predictions and insights.
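NDAS has not published its model, so the details are unknown. But the reported approach — combining many indicators learned from historical offender data into a single propensity score — resembles a standard logistic risk model. The sketch below is purely illustrative: the indicator names, weights, and threshold are invented for this example and are not drawn from NDAS.

```python
import math

# Hypothetical indicator weights. In a real system these would be learned
# from historical data (e.g. by fitting a logistic regression); here they
# are invented purely for illustration.
WEIGHTS = {
    "prior_violent_offences": 0.9,
    "age_at_first_arrest": -0.05,   # earlier first arrest -> higher risk
    "known_associates_flagged": 0.6,
    "custody_incidents": 0.4,
}
BIAS = -2.0  # baseline log-odds when all indicators are zero

def risk_score(indicators: dict) -> float:
    """Combine indicator values into a 0-1 risk score via a logistic function."""
    z = BIAS + sum(w * indicators.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

# Example record for a fictional individual.
person = {
    "prior_violent_offences": 2,
    "age_at_first_arrest": 16,
    "known_associates_flagged": 1,
    "custody_incidents": 0,
}
print(round(risk_score(person), 3))  # prints 0.401
```

Individuals whose score exceeds some chosen threshold would be flagged for intervention. Note that this toy example also illustrates the critics' point: if the historical data encodes biased policing, the learned weights reproduce that bias.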

It does not take much imagination to see a situation where this technology is used in a discriminatory way to draw conclusions about a person based on their past. Police intend to use Artificial Intelligence (AI) coupled with their data analysis to make the predictions, but human rights groups argue that this will only further embed human biases into our criminal justice system. Groups such as the American Civil Liberties Union (ACLU), the Alan Turing Institute, and the Brennan Center for Justice have all criticized the program. In response, West Midlands Police and other UK police forces have created an ethics panel to examine such concerns about the project. Tom McNeil, a strategic adviser to NDAS, says: “We want to see analytics being used to justify investment in social mobility in this time of harmful austerity, addressing deep-rooted inequalities and helping to prevent crime.”

McNeil raises another point. The program could actually help level the playing field against systemic inequalities by identifying the roots of violent crime and enabling interventions before people reach that point. By this logic, it is not only life-saving software but life-changing software: technology with the ability to place vulnerable people on less violent paths and ultimately transform their lives.

However, because the project lacks transparency and its data remains closed to the general public, people remain skeptical. Liberty, a prominent human rights organization in the UK, has warned that NDAS threatens freedom of expression and privacy, and can encourage racial profiling and discrimination. Police forces may be acknowledging biases and attempting to incorporate ethics into the program, but as Hannah Couchman, Liberty’s policy and campaigns officer, explains, much of the data being used is “already imbued with discrimination and bias from the way people policed in the past” and is then “entrenched by algorithms”.

It is clear that there are moral and ethical dilemmas that must be discussed and considered before the program expands further. Although incorporating data analytics into public safety may help reduce crime, it can also have dire ethical ramifications for our society, and those merit continued debate.
