
EXPERT COMMENT: AI profiling: the social and moral hazards of ‘predictive’ policing

9th March 2018

Mike Rowe, Professor of Criminology at Northumbria University, discusses the moral and ethical issues raised by police use of AI predictions.

A UK police force has been forced to alter an algorithm designed to help it make custody decisions, amid concerns that it could discriminate against poor people.

Durham Constabulary has been developing an algorithm to better predict the risk posed by offenders and to ensure that only the most “suitable” are granted police bail. But the programme has also highlighted potential social inequalities that can be maintained through the use of these big data strategies.

This might seem surprising, since a key selling point of such programmes is their apparent neutrality: technocratic evaluations of risk based on information that is “value-free” (grounded in objective calculation, eschewing subjective bias).

In practice, the apparent neutrality of the data is questionable. It has been reported that Durham Police will no longer use postcodes as one of the data points in their model, since it has been argued that doing so perpetuates stereotypes about neighbourhoods that have negative consequences for all residents – such as increased house insurance premiums and decreased house prices.

‘The ratchet effect’

Even so, algorithms rely on data that reflects – and so perpetuates – inequalities in criminal justice practice. A powerful critique of these methods by US law professor Bernard Harcourt notes that they “…serve only to accentuate the ideological dimensions of the criminal law and hardens the purported race, class, and power relations between certain offences and certain groups”.

Using models of risk as a basis for police decision-making means that those already subject to police attention will be profiled ever more closely. More data on their offending will be uncovered, the focus on them will intensify, more offending will be identified – and so the cycle continues.

An unintended consequence of this is that those not subject to significant attention will be able to continue to offend with less hindrance. So the crack cocaine user buying drugs on the street is more likely to be caught in what Harcourt termed “the ratchet effect” than the middle-class professional ordering cocaine for delivery over the internet.
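
To make the dynamic concrete, here is a deliberately simplified simulation – a hypothetical sketch with arbitrary numbers, not any real force's model. Two groups offend at exactly the same underlying rate, but patrols are concentrated wherever recorded offending already looks worse, so the recorded figures for the initially over-policed group pull further ahead every round even though actual behaviour never differs.

```python
import random

random.seed(1)

TRUE_RATE = 0.05     # both groups offend at the same underlying rate
POPULATION = 10_000  # people per group

# A small historical imbalance in recorded offences is all it takes.
recorded = {"group_a": 60, "group_b": 40}

for round_no in range(1, 11):
    # "Data-led" policy: concentrate patrols on the group whose recorded
    # offending looks worse, so offences there are far more likely to be
    # detected than in the lightly policed group.
    target = max(recorded, key=recorded.get)
    for group in recorded:
        detection_prob = 0.5 if group == target else 0.05
        detected = sum(
            random.random() < TRUE_RATE * detection_prob
            for _ in range(POPULATION)
        )
        recorded[group] += detected
    print(f"round {round_no}: {recorded}")
```

After ten rounds the recorded gap between the groups has grown many times over, purely as an artefact of where attention was directed.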

Big data policing

“Big data” techniques – in which complex algorithms mine vast swathes of information to make predictions about future behaviour – are increasingly being applied to policing and criminal justice.

It is easy to understand why. The possibilities that increasingly scarce police resources can be targeted at the individuals most likely to commit crime, and that bail decisions can be made more reliably so that only the riskiest individuals are jailed before trial, are both attractive propositions.

After all, it can only benefit society if we can intervene before a crime is even committed. It would save resources and prevent the human, social and economic costs that offending produces.

Sensors, data sets and intelligence

Police services around the world are increasingly utilising AI to develop “predictive policing” in an attempt to replace the relatively ineffective traditional model whereby police respond to offences after the damage has been done.

Police services in the US have used complex data sets to predict potential spikes in crime. These data sets collate everything from dates and times to weather patterns, highly localised geographical information, social media messages, and even local sporting fixtures.
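
As an illustration of the kind of tabular set-up this involves, the sketch below fits a generic model to entirely synthetic data – the features, numbers and model choice are illustrative assumptions, not any force's actual system.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training table: one row per (area, day), with the kinds
# of inputs the article describes. Every value here is synthetic.
n = 500
day_of_week = rng.integers(0, 7, n)   # 0 = Monday ... 6 = Sunday
temperature = rng.normal(12, 6, n)    # degrees Celsius
match_day = rng.integers(0, 2, n)     # local sporting fixture? 0/1
area_id = rng.integers(0, 20, n)      # coarse geographic cell

# Synthetic target: incident counts loosely driven by the same inputs.
incidents = np.maximum(
    0,
    (5 + 2 * (day_of_week >= 5) + 0.2 * temperature
     + 3 * match_day + rng.normal(0, 2, n)).round(),
)

X = np.column_stack([day_of_week, temperature, match_day, area_id])
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, incidents)

# Forecast incidents for a warm Saturday with a match on, in area 7.
print(model.predict([[5, 18.0, 1, 7]]))
```

Such a model is only as good – and as fair – as the data fed into it, which is precisely where the concerns set out above arise.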

Some cities are using hidden webs of acoustic sensors to record gunshots and their associated background noise, and so – by collating vast numbers of examples – to learn which sounds are most often associated with firearms being discharged. Once those patterns are known, detecting them signals that weapons are more likely to be fired.
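
In outline, such a system reduces each short audio clip to a handful of acoustic features and asks whether they match the profile learned from labelled gunshot recordings. The sketch below is purely illustrative: synthetic signals, two crude features and a hand-set threshold standing in for a classifier trained on vast labelled corpora.

```python
import numpy as np

SAMPLE_RATE = 8_000  # Hz; an assumed rate, for illustration only

def features(clip: np.ndarray) -> tuple[float, float]:
    """Two crude descriptors: how impulsive the clip is (peak-to-mean
    amplitude) and where its energy sits in the spectrum (centroid)."""
    peak_ratio = float(np.max(np.abs(clip)) / (np.mean(np.abs(clip)) + 1e-9))
    spectrum = np.abs(np.fft.rfft(clip))
    freqs = np.fft.rfftfreq(len(clip), 1 / SAMPLE_RATE)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))
    return peak_ratio, centroid

# Synthetic stand-ins: a gunshot-like broadband impulse and a steady hum.
t = np.linspace(0, 0.5, SAMPLE_RATE // 2)
impulse = np.exp(-40 * t) * np.random.default_rng(0).normal(size=t.size)
hum = 0.3 * np.sin(2 * np.pi * 60 * t)

for name, clip in [("impulse", impulse), ("hum", hum)]:
    peak, centroid = features(clip)
    # A deployed system would feed such features, for vast numbers of
    # labelled clips, into a trained classifier, not a fixed threshold.
    label = "possible gunshot" if peak > 5 and centroid > 500 else "background"
    print(f"{name}: peak_ratio={peak:.1f}, centroid={centroid:.0f} Hz -> {label}")
```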

Police work has always been based on intelligence and local information. As early as 1977, sociology professor William Sanders argued that detective work was essentially about information processing.

For much of the last two centuries, though, that basis has been limited to the intelligence an individual beat officer can collect and share on a fairly small scale with colleagues. The power to aggregate big data, and the technological capacity to push this information to frontline officers, transforms the power and reach of intelligence within policing.

While the use of AI predictions in police and law enforcement is still in its early stages, it is vital to scrutinise any warning signs that emerge from its use. One standout example is a 2016 ProPublica investigation, which found that the COMPAS risk-assessment software was biased against black defendants.
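
ProPublica's headline finding concerned error rates: black defendants who did not go on to reoffend were roughly twice as likely as white defendants to have been labelled high risk. The arithmetic behind that kind of audit is simple; here is a minimal sketch on invented records.

```python
from collections import defaultdict

# Toy audit records: (group, predicted_high_risk, actually_reoffended).
# The data is invented purely to show the calculation.
records = [
    ("group_a", True, False), ("group_a", True, True),
    ("group_a", True, False), ("group_a", False, False),
    ("group_b", True, True), ("group_b", False, False),
    ("group_b", False, False), ("group_b", True, False),
]

false_pos = defaultdict(int)  # flagged high risk but did not reoffend
negatives = defaultdict(int)  # everyone who did not reoffend

for group, predicted_high, reoffended in records:
    if not reoffended:
        negatives[group] += 1
        if predicted_high:
            false_pos[group] += 1

# The false positive rate is the share of non-reoffenders wrongly flagged;
# a large gap between groups is the disparity ProPublica reported.
for group in sorted(negatives):
    rate = false_pos[group] / negatives[group]
    print(f"{group}: false positive rate = {rate:.0%}")
```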

So society needs to maintain a critical perspective on the use of AI on moral and ethical grounds – not least because the details of the algorithms, the data sources and the assumptions on which their calculations rest are often closely guarded secrets, held by the specialist IT companies that develop them and want to maintain confidentiality for commercial reasons. The social, political and criminal justice inequalities likely to arise should make us question the potential of predictive policing.

This article was originally published on The Conversation. You can view the original article here.
