Mock crime prediction tool profiles MEPs as potential criminals

Developed by Fair Trials, the example crime prediction tool uses the same information as police systems to assess the likelihood of someone committing a crime in the future

By Sebastian Klovig Skelton

Published: 16 Feb 2023 14:45

Several MEPs have been profiled as “at risk” of criminal behaviour after using a mock crime prediction tool created by non-governmental organisation Fair Trials to highlight the discriminatory and unfair nature of predictive policing systems.

The online tool – which asks users questions designed to draw out the kinds of information police across Europe are using to “predict” whether someone will commit a crime – was launched on 31 January 2023, and has been used by MEPs and members of the public from across the European Union (EU).

Predictive policing systems can be used to profile and make “predictions” about criminality in both individuals and locations, which Fair Trials said is determined by a range of data about education, family life and background, engagement with public services such as welfare benefits and housing, ethnicity, nationality, credit scores, and whether someone has previously been in contact with police, even as a victim.   
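The kind of scoring such systems imply can be sketched in a few lines. The features, weights and thresholds below are entirely invented for illustration — they mirror the data categories the article lists (education, benefits, housing, prior police contact), not any real police system or Fair Trials’ actual questionnaire.

```python
# Purely illustrative mock of a questionnaire-style risk profiler.
# All feature names and weights are hypothetical.

RISK_WEIGHTS = {
    "left_school_early": 2,
    "receives_welfare_benefits": 1,
    "lives_in_social_housing": 1,
    "prior_police_contact": 3,  # counted even if the person was the victim
}

def risk_band(answers: dict) -> str:
    """Sum the weights of every 'yes' answer and map to a risk band."""
    score = sum(w for k, w in RISK_WEIGHTS.items() if answers.get(k))
    if score >= 4:
        return "high risk"
    if score >= 2:
        return "medium risk"
    return "low risk"

# Someone who was once a crime victim and claims benefits is
# already pushed into the top band by this toy scoring.
print(risk_band({"prior_police_contact": True,
                 "receives_welfare_benefits": True}))
```

The point the sketch makes is structural: none of the inputs describes anything the person has done wrong, yet the additive scoring converts background circumstances directly into a “risk” label.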

People profiled as “at risk” on the basis of this information face a range of serious consequences, from being subject to regular stop and search to having their children removed by social services. The profiles and predictions are also used to inform pre-trial detention, prosecution, sentencing and probation decisions.

As a result of the tool’s use, Fair Trials said more than 1,000 emails were subsequently sent to MEPs by members of the public calling on them to ban predictive policing systems in the EU’s upcoming Artificial Intelligence (AI) Act.

“Our interactive predictive tool shows just how unjust and discriminatory these systems are. It might seem unbelievable that law enforcement and criminal justice authorities are making predictions about criminality based on people’s backgrounds, class, ethnicity and associations, but that is the reality of what is happening in the EU,” said Griff Ferris, senior legal and policy officer at Fair Trials.

“There’s widespread evidence that these predictive policing and criminal justice systems lead to injustice, reinforce discrimination and undermine our rights. The only way to protect people and their rights across Europe is to prohibit these criminal prediction and profiling systems, against people and places.”

Socialists and Democrats (S&D) MEP Petar Vitanov, who was profiled by the mock tool as having a “medium risk” of committing a crime in the future, said there is no place for such unreliable, biased and unfair systems in the EU.

“I have never thought that we will live in a sci-fi dystopia where machines will ‘predict’ if we are about to commit a crime or not,” he said. “I grew up in a low-income neighbourhood, in a poor Eastern European country, and the algorithm profiled me as a potential criminal.”

Renew MEP and member of the legal affairs committee, Karen Melchior, who was profiled as “at risk”, added that the automated judging of people’s behaviour will lead to discrimination and irrevocably alter people’s lives.

“We cannot allow funds to be misplaced from proper police work and well-funded as well as independent courts to biased and random technology,” she said. “The promised efficiency will be lost in the clean-up after wrong decisions – when we catch them. Worst of all, we risk destroying lives of innocent people. The use of predictive policing mechanisms must be banned.”

Other MEPs profiled as a risk by the tool, and who subsequently expressed their support for banning predictive policing systems, include Cornelia Ernst, Tiemo Wölken, Birgit Sippel, Kim van Sparrentak, Tineke Strik and Monica Semedo.

Civil society groups such as Fair Trials, European Digital Rights (EDRi) and others have long argued that because the underlying information used in predictive policing systems is drawn from data sets that reflect the historical structural biases and inequalities in society, the use of such systems will result in racialised people and communities being disproportionately targeted for surveillance, questioning, detaining and, ultimately, imprisonment by police.

In March 2020, evidence submitted to the United Nations (UN) by the UK’s Equalities and Human Rights Commission (EHRC) said the use of predictive policing could replicate and magnify “patterns of discrimination in policing, while lending legitimacy to biased processes”.

Lawmakers have come to similar conclusions. In the UK, for example, a House of Lords inquiry into the police use of algorithmic technologies noted that predictive policing tools tend to produce a “vicious circle” and “entrench pre-existing patterns of discrimination” because they direct police patrols to low-income, already over-policed areas based on historic arrest data.

“Due to increased police presence, it is likely that a higher proportion of the crimes committed in those areas will be detected than in those areas which are not over-policed. The data will reflect this increased detection rate as an increased crime rate, which will be fed into the tool and embed itself into the next set of predictions,” it said.
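The “vicious circle” the inquiry describes can be simulated in a few lines. The numbers below are invented purely for illustration: two areas have identical true offending, but patrols are allocated according to recorded crime, and recording depends on patrol presence.

```python
# Toy simulation of the feedback loop: detection follows patrols,
# patrols follow detection. All parameters are hypothetical.

TRUE_OFFENCES = 100  # identical underlying offences in both areas each period

def recorded(patrol_share: float) -> float:
    """Fraction of offences actually detected scales with patrol presence."""
    return TRUE_OFFENCES * patrol_share

def step(p_a: float) -> float:
    """Reallocate patrols toward the area with more recorded crime.
    Squaring models the tool concentrating extra resources on the
    area it has flagged as 'high risk'."""
    rec_a, rec_b = recorded(p_a), recorded(1 - p_a)
    return rec_a**2 / (rec_a**2 + rec_b**2)

p = 0.55  # area A starts only slightly over-policed
for period in range(8):
    p = step(p)

# A small initial imbalance compounds: area A's patrol share drifts
# toward 1.0, so it appears ever more 'criminal' in the data.
print(round(p, 3))
```

A 55/45 starting split is enough: within a handful of periods virtually all patrols, and therefore virtually all recorded crime, sit in area A, even though both areas offend at exactly the same rate.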

Although the two MEPs in charge of overseeing and amending the EU’s AI Act said in April 2022 that the use of AI-powered predictive policing tools to make “individualised risk assessments” should be prohibited on the basis that it “violates human dignity and the presumption of innocence”, the proposed prohibition only extended to individualised assessments and not place-based predictive systems used to profile areas or locations.

Sarah Chander, a senior policy adviser at EDRi, told Computer Weekly at the time that profiling neighbourhoods for the risk of crime has a similar effect to profiling individuals, in that it “can increase experiences of discriminatory policing for racialised and poor communities”.

Civil society groups have therefore called – on multiple occasions – for predictive policing systems to be completely banned.

While amendments are constantly being made to the AI Act, the limited ban on predictive policing systems has not yet been extended to place-based systems. MEPs’ next vote on the act is due to take place at the end of March 2023, with the exact date yet to be confirmed.
