Algorithms in the Criminal Justice System: The Law Society

Research output: Contribution to conference › Other

Abstract

The Law Society established the Technology and the Law Policy Commission to examine the use of algorithms in the justice system of England and Wales. The Commission considered both simpler ‘hand-crafted’ systems and more complex, computationally generated ones such as machine learning. It held four public evidentiary sessions, interviewed over 75 experts, and read over 82 submissions of evidence and many more supplementary studies, reports and documents on the topic.

This report contains findings and recommendations concerning the use of algorithmic systems in the criminal justice system. The Commission considered a range of currently deployed systems that fell within this brief, including individual risk assessment and recidivism prediction; prospective crime mapping and hot-spotting; and mobile phone data extraction tools.

At the most basic level, the Commission has found a lack of explicit standards, best practice, and openness or transparency about the use of algorithmic systems in criminal justice across England and Wales. This was concerning, as the high-stakes decisions and measures taken in the justice system demand extremely careful deployment.

There are significant challenges of bias and discrimination, opacity and due process, consistency, amenability to scrutiny, effectiveness, and the disregard of qualitative and contextual factors, all against a backdrop of the potential of these systems to change more deeply how the law itself evolves. The Commission recommends that a National Register of Algorithmic Systems should be created as a crucial initial scaffold for further openness, cross-sector learning and scrutiny.

While many deployments are in a pilot or experimental stage, the Commission notes that the technologies being deployed are not so technically novel that they cannot be critically assessed by multi-disciplinary teams for their effectiveness, their conformity to real challenges, and their potential for unintended and undesirable side effects, particularly from optimising for some goals or aspects of an issue to the detriment of others. It is key that in-house capacity is built and retained for overseeing and steering these systems, and that coordination occurs across the justice system to ensure this capacity is world-class.

In-house capacity is only one piece of the puzzle. Governing algorithmic systems in criminal justice usually brings multi-dimensional tensions and value-laden choices to grapple with. These tensions emerge at many different points in development, deployment and maintenance, and are usually not between a ‘bad’ and a ‘good’ outcome, but between different values that are societally held to be of similar importance. It is insufficient and unacceptable for the bodies and agencies involved to make these decisions alone; they require instead the engagement of a broad range of stakeholders, including civil society, academia, technology firms and the justice system more broadly. The risk of systems being gamed is real, but often overstated relative to the risks from a lack of openness and engagement, and from the loss of trust in procedural justice and the rule of law. Such risks stem especially from what are effectively policy decisions being baked into algorithmic systems invisibly and unaccountably by contractors and vendors. The Commission’s work has highlighted that such crucial, often political design choices should never be outsourced.

The Commission has also analysed the broader and often new legal framework that in part governs algorithmic systems in criminal justice. In the course of evidence-taking, the Commission became heavily concerned that some systems and databases operating today, such as facial recognition in policing or some uses of mobile device extraction, lack a clear and explicit lawful basis. This must be urgently examined, publicly clarified and rectified if necessary. While the United Kingdom has more explicit provisions covering algorithmic systems than many other parts of the world, these contain significant omissions and loopholes that need joined-up consideration. The Commission recommends several clarifications and changes to data protection legislation, procurement codes, freedom of information law, equality duties and statutory oversight and scrutiny bodies which would provide key safeguards for the integrity of criminal justice in the digital age.

Many of the heavily individualised legal safeguards proposed for algorithmic systems in commercial domains, such as individual explanation rights, are unlikely to be very helpful in criminal justice, where imbalances of power can be extreme and are exacerbated by dwindling levels of legal aid. Societal, systemic oversight must be placed at the forefront of algorithmic systems in this sector, which will require innovative and world-leading policies. The United Kingdom has a window of opportunity to become a beacon for a justice system trusted to use technology well, with a social licence to operate and in line with the values and human rights underpinning criminal justice. It must take proactive steps to seize that window now.
Original language: English
Status: Published - 12 Jun 2019
Event: Algorithms in the Criminal Justice System: The Law Society - Senedd, Cardiff, United Kingdom
Duration: 12 Jun 2019 → 12 Jun 2019
https://michae.lv/static/papers/2019algorithmsjusticesystem.pdf

Conference

Conference: Algorithms in the Criminal Justice System
Country/Territory: United Kingdom
City: Cardiff
Period: 12/06/19 → 12/06/19
Internet address
