Artificial intelligence in criminal justice: Fair Trials' work
Law enforcement and criminal justice authorities are increasingly using artificial intelligence (AI) and automated decision-making (ADM) systems. These systems are often used to profile people, 'predict' their actions, and assess the risk that they will engage in certain behaviour in the future, such as committing a crime. This can have devastating consequences for the people involved, who may be profiled as criminals or treated as a risk even though they have not actually committed a crime.
We want States to prohibit the use of predictive, profiling and risk-assessment AI and ADM systems in law enforcement and criminal justice. Only an outright ban can protect people from the fundamental harms these systems cause.