Criminal Legal Algorithms, Technology, and Expertise (CLATE)
Criminal Legal AI | 2024–present
Carceral algorithms encompass the broad category of algorithmic, automated, and data-driven practices employed in the criminal legal system. They can be as simple as a checklist or involve deep learning and complex statistical models, but their influence extends beyond technical capacity. While often introduced as part of an “objectivity campaign” that positions the technology as more impartial, objective, and scientific than human decision-making, in practice these algorithms rely on human decision-makers in ways that can create tensions in established regulatory structures, reinforce or obfuscate existing biases, and expand the scope of carceral systems.
The Criminal Legal Algorithms, Technology, and Expertise (CLATE) project investigates how introducing such tools destabilizes work practices, legal frameworks, and the legitimacy of expert authority. Drawing on a combination of interviews, legal analysis, and quantitative data, we explore how algorithms challenge decision-making processes in policing and prosecution, and how expertise gets wielded. We compare how these dynamics unfold across international contexts and across different technological interventions, such as probabilistic DNA profiling, facial recognition technology, risk assessment instruments, and predictive policing.
Leads: Hannah Pullen-Blasnik and Julien Larregue
Research Highlights
- Published in 2023 in Social Studies of Science, 54(1), 30–58. Read here.
Abstract
What happens when an algorithm is added to the work of an expert group? This study explores how algorithms pose a practical problem for experts. We study the introduction of Probabilistic DNA Profiling (PDP) software into a forensics lab through interviews and court admissibility hearings. While the software was meant to support experts’ decision-making, in practice it has destabilized their authority. They respond to this destabilization by producing alternating and often conflicting accounts of the agency and significance of the software. The algorithm is constructed alternately as merely a tool or as indispensable statistical backing; the analysts’ authority as either independent of the algorithm or reliant upon it to resolve conflict and reach a final decision; and forensic expertise as resting either with the analysts or with the software. These tensions reflect the forensic ‘culture of anticipation’, specifically the experts’ anticipation of ongoing litigation, which destabilizes their control over the deployment and interpretation of expertise in the courtroom. The software highlights tensions between the analysts’ supposed impartiality and their role in the courtroom, exposing legal and narrative implications of the changing nature of expertise and technology in the criminal legal system.