Police forces and criminal justice authorities across Europe are using data, algorithms and artificial intelligence (AI) to ‘predict’ whether certain people are at ‘risk’ of committing crime in future, and whether and where crime will occur in certain areas.
We at Fair Trials are calling for a ban on ‘predictive’ policing and justice systems. Take the quiz below to see if you’d be profiled or seen as a ‘risk’ – and find out how to support our campaign.
Unlike the authorities, we will of course not collect or retain any information about you or your answers!
These automated and algorithmic systems are often secret and opaque, with authorities refusing to provide information on how they work. We, however, will of course explain our ‘algorithm’. Every question in our example profiling tool is matched directly to information actively used by law enforcement and criminal justice authorities in their own versions of predictive and profiling systems and databases. The reality is that matching just a few of these pieces of information (as asked by our questions) can be enough for a person to be marked as a ‘risk’. Likewise, if an area fits a similar profile, it too will be marked as at ‘risk’ of crime occurring. We have made our own transparent and explainable version to show just how discriminatory and unjust these assessments are.
0 — 3 ‘Yes’ answers: ‘Low’ risk outcome
4 — 5 ‘Yes’ answers: ‘Medium’ risk outcome
6 — 10 ‘Yes’ answers: ‘High’ risk outcome
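The banding above can be written out in a few lines of code. This is a minimal sketch of how such a crude scoring rule works; the function name and input check are our own illustration, not Fair Trials’ actual quiz code.

```python
# Illustrative sketch of the quiz's transparent scoring rule:
# count the 'Yes' answers and map the total to a risk band.
# (Hypothetical names; not the actual quiz implementation.)

def risk_outcome(yes_answers: int) -> str:
    """Map a count of 'Yes' answers (0-10) to the quiz's risk band."""
    if not 0 <= yes_answers <= 10:
        raise ValueError("expected between 0 and 10 'Yes' answers")
    if yes_answers <= 3:
        return "Low"
    if yes_answers <= 5:
        return "Medium"
    return "High"
```

As the sketch makes plain, the whole ‘assessment’ is nothing more than counting matches against a profile: answer ‘Yes’ a handful of times and the outcome jumps from ‘Low’ to ‘High’.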