How should AI be used in the justice system?
Artificial intelligence promises to support judges in their decision-making. In fact, it renders their decisions and tasks more complex, argues Ayisha Piotti.
Artificial intelligence (AI) is transforming our society. Beyond currently hyped systems such as ChatGPT or the image generator DALL-E, the technology has also found its way into the daily work of lawyers and even judges. At present, such applications are still in their early stages, but AI has the potential to change our justice system. Yet as well as promising significant benefits, it raises new ethical and legal questions. Most importantly, it will change the roles and skill requirements of lawyers and judges.
By deploying AI, law firms and courts will be able to work more efficiently and automate repetitive tasks. Let’s hope this reduces the notorious backlog of cases in today’s courts. Lawyers are already using AI-powered systems, for example to analyse large volumes of documents and contracts. And some courts in the United States use AI systems to assist with sentencing decisions or to predict an offender’s risk of recidivism.
However, there are pitfalls: today’s AI algorithms are often not transparent enough to meet the justice system’s high demands for accountability. Critics also fear that AI systems could perpetuate bias and discrimination, because the reliability of AI depends on the quality of the data it is trained on. If AI support systems are trained on biased data, they can produce unfair outcomes.
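To see how biased training data can propagate into unfair outcomes, here is a deliberately simplified Python sketch. Everything in it is invented for illustration, not drawn from any real court system: two groups have an identical true reoffence rate, but one group’s past offences were simply recorded more often. A naive risk model trained on those records then scores that group as roughly twice as risky.

```python
import random

random.seed(0)

# Toy setup: both groups have the SAME true reoffence rate, but
# group B was historically policed more heavily, so its offences
# were recorded twice as often as group A's.
TRUE_RATE = 0.30                        # identical underlying behaviour
RECORDING_RATE = {"A": 0.5, "B": 1.0}   # chance an offence is recorded

def make_record(group):
    reoffended = random.random() < TRUE_RATE
    recorded = reoffended and random.random() < RECORDING_RATE[group]
    return group, recorded

training_data = [make_record(g) for g in ("A", "B") for _ in range(10_000)]

def risk_score(group):
    """A naive 'model' that just learns the recorded rate per group."""
    records = [rec for g, rec in training_data if g == group]
    return sum(records) / len(records)

for g in ("A", "B"):
    print(f"Group {g}: predicted risk {risk_score(g):.2f} (true rate {TRUE_RATE})")
# Group B is scored as roughly twice as risky as group A, even though
# both groups reoffend at exactly the same underlying rate.
```

The point of the sketch is not the arithmetic but the mechanism: the model faithfully reproduces the bias in the record-keeping, and nothing in its output reveals that the disparity stems from the data rather than from behaviour.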
To minimise such unintended consequences, it is imperative to rigorously test AI systems before deploying them. New regulatory guidelines are also needed. The EU rightly classifies the use of AI in the justice system as a high-risk application and will regulate it strictly – in terms of transparency, oversight and cybersecurity, among other things – in the AI Act currently under discussion.
For me, the justice system belongs to the sensitive and critical areas of our lives where AI systems must not replace humans altogether. To guarantee accountability, we need to keep humans in the loop; I’m convinced that humans are needed as guardians of AI. Judges in the future will not only have to pass judgements, but also decide how AI is used in their work. They will have to know the advantages and disadvantages of AI, keep an eye on the associated trade-offs, and weigh these consciously in their decisions.
One such trade-off, for example, is that between speed and thoroughness. AI is undoubtedly fast, and some lawyers argue that “justice delayed is justice denied”. An imperfect decision rendered today may therefore be better than a perfect decision not rendered at all.
“We need to keep humans in the loop.” – Ayisha Piotti
Another trade-off is that between expediency and certainty. Today, there are still no standards governing the use of AI in the judiciary or guaranteeing its validity and reliability. Judges must therefore exercise their own judgement when they allow AI to support them in their work.
Often commercially developed
A third dilemma arises from the fact that AI today is often developed commercially rather than as open-source technology. I think developing companies have a legitimate interest in protecting their trade secrets, but this clashes with our insistence that legal decisions be transparent and accountable. If we want to provide the justice system with AI tools such as probabilistic analyses of DNA matches or decision-making aids in sentencing, we will in many cases have to compromise on transparency.
AI has the potential to make the justice system more efficient, faster and qualitatively better, but this entails balancing trade-offs and making compromises. As with judges, the use of AI in many other areas of our lives creates an urgent need to equip the “humans in the loop” with a high degree of AI-specific expertise and the self-confidence to make these difficult decisions. As a society, we must now collectively define what these trade-offs should be and build this capacity.