DOI: https://doi.org/10.58248/HS77

Teaser text

How can the police and wider criminal justice system best balance the potential benefits and risks of using new technologies?

Overview

A range of new technologies and applications are increasingly available to police forces and the wider criminal justice system (CJS).

Police forces are already making use of new technologies for:[i]

  • crime prevention, for example predictive policing using AI to predict hotspots for future crime (PN708), as sketched after this list
  • mobility, such as autonomous vehicles and drones
  • identification and tracing, for example, automatic number plate recognition
  • surveillance and sensing, including face recognition technologies
  • analytics, using AI-enabled approaches
  • communications and interconnectivity, such as automated updates for victims or AI technology used to support call and response routing
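
As a simple illustration of the hotspot-prediction idea in the first bullet, the sketch below (Python, with entirely hypothetical incident coordinates; it is not any force's actual predictive model) bins recorded crime locations into grid cells and flags the busiest cells. Deployed systems are considerably more sophisticated, but the basic principle of scoring locations using past incident data is the same.

```python
# Minimal sketch of grid-based crime hotspot scoring (illustrative only;
# not any police force's actual predictive model). Incident coordinates
# below are hypothetical.
from collections import Counter

def hotspot_cells(incidents: list[tuple[float, float]], cell_size: float, top_n: int = 3):
    """Bin incident coordinates into square grid cells and return the
    cells with the most recorded incidents."""
    counts = Counter(
        (int(x // cell_size), int(y // cell_size)) for x, y in incidents
    )
    return counts.most_common(top_n)

# Hypothetical (x, y) coordinates of recorded incidents, in metres.
incidents = [(120, 340), (130, 355), (128, 349), (610, 90), (615, 88), (300, 300)]
print(hotspot_cells(incidents, cell_size=100.0))
# [((1, 3), 3), ((6, 0), 2), ((3, 3), 1)]: the (1, 3) cell is the busiest
```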

Since 2016, a government programme has introduced more technology into the courts service with the aim of making the system “more straightforward, accessible and efficient”.[ii] This includes bringing services online; for example, HM Courts & Tribunals Service plans to move to a new video hearings service in 2024.[iii]

Online, cyber-enabled, and AI-enabled crime is increasing (see HS56 Cyber Crime and Harm). For example, developments in fraud (PN720), grooming and ‘sextortion’, stalking, and ransomware incidents[iv],[v],[vi],[vii] have direct implications for crime prevention, investigation and victim support (see Horizon Scan article on Victims’ Access to Justice and Support).[viii],[ix],[x],[xi],[xii] Perpetrators of online crimes may be harder to identify and can operate from any location.7 The Online Safety Act 2023 introduced new criminal offences from January 2024, including cyberflashing, threatening communications and intimate image abuse.[xiii]

Challenges and opportunities

Government, police forces and other sector stakeholders have noted potential benefits of using new technologies in the CJS (PN708):

  • improving performance, for example, in preventing and detecting crime, using facial recognition technology (FRT) to identify criminals, or speeding up court procedures[xiv],[xv],[xvi],[xvii],[xviii],[xix]
  • improving efficiency by freeing up police time and capacity for higher priority work16,19,[xx]
  • making it easier for individuals and communities to communicate and interact with the police,1,17 including for people with disabilities or where English is not their first language[xxi],[xxii]
  • supporting transparency, accountability and trust,14,[xxiii] for example, through the use of body-worn cameras16,[xxiv],[xxv],[xxvi]

Parliamentary, academic and sector stakeholders debate several areas of concern or risk arising from the introduction of new technologies.

There has been public discussion about how the police and CJS must balance the rights of individuals with societal safety when introducing new technologies.15,[xxvii] An independent report for the Scottish government emphasised the importance of “a rights based, ethical, evidence-based, consultative approach to innovation and adoption of emerging technologies in policing, within a robust oversight framework”.[xxviii] Aspects discussed include:

  • ethics and human rights, civil liberties and the right to a fair trial (PN708)28,[xxix]
  • data protection29
  • validity and accuracy of technologies15,28
  • risks around equality and discrimination24,28
  • right to privacy (PN708), for example, in relation to access to smart doorbell footage[xxx],[xxxi]

Debate has already arisen in relation to police use of FRT.[xxxii] Academics and civil liberties groups have raised multiple concerns, including the intrusiveness of FRT and its impact on privacy rights, and a lack of oversight, accountability and transparency (PN708).19,27,29,32 There is particular debate about the relative accuracy of FRT for different demographic groups, and the risk that its use exacerbates discrimination (PN731).19,32 A 2023 National Physical Laboratory evaluation for the Metropolitan Police Service found no differences in identification accuracy by age, gender or ethnicity for Retrospective or Operator-Initiated FRT. However, for Live FRT operating at lower face-match thresholds, it found some differences in accuracy for people aged under 20 and for Black subjects.[xxxiii] (Lower thresholds mean the FRT system is less strict, increasing the risk of matching an innocent person with an individual on an offender watchlist.)
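
The effect of the face-match threshold can be illustrated with a minimal sketch (Python; the watchlist identifiers, similarity scores and threshold values are entirely hypothetical, and this is not based on any deployed FRT system). A probe face is scored for similarity against each watchlist image, and only scores at or above the threshold raise an alert, so lowering the threshold produces more alerts and, with them, a higher risk of false matches.

```python
# Illustrative sketch (not any force's actual system): how a face-match
# threshold converts similarity scores into watchlist alerts.
from dataclasses import dataclass

@dataclass
class MatchResult:
    watchlist_id: str
    score: float      # similarity between probe face and watchlist image, 0.0-1.0
    is_alert: bool    # True if the score meets or exceeds the threshold

def screen_probe(probe_scores: dict[str, float], threshold: float) -> list[MatchResult]:
    """Flag every watchlist entry whose similarity score meets the
    face-match threshold. A lower threshold is less strict: more entries
    clear it, so the chance of flagging an innocent person (a false
    positive) rises."""
    return [
        MatchResult(watchlist_id=wid, score=s, is_alert=s >= threshold)
        for wid, s in probe_scores.items()
    ]

# Hypothetical similarity scores for one passer-by against a small watchlist.
scores = {"person_A": 0.62, "person_B": 0.71, "person_C": 0.58}

strict = [m for m in screen_probe(scores, threshold=0.70) if m.is_alert]
lenient = [m for m in screen_probe(scores, threshold=0.55) if m.is_alert]
print(len(strict), "alert(s) at threshold 0.70")   # 1 alert
print(len(lenient), "alert(s) at threshold 0.55")  # 3 alerts
```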

The Chief Scientific Adviser to the Police, and academic commentators, have noted that maintaining public trust remains a major challenge to the introduction of technology,1,[xxxiv] and emphasised the importance of public communication and transparency.1,27,28,29,34

Researchers have also highlighted potential links between public trust in new technology and ‘policing by consent’, including in securing legitimacy, public compliance and co-operation.17,26,[xxxv],[xxxvi] In particular, little is known about how the existing use of technology in police-public interactions (for example, chatbots for crime reporting, social media engagement with communities, and body-worn cameras) affects ‘policing by consent’.26,36,[xxxvii]

A 2022 Justice and Home Affairs Committee report on the use of new technologies in the CJS also highlighted challenges around a lack of oversight, regulation and scrutiny of applications, and limited information on what is being used, where and how.15,[xxxviii] The report, supported by research, noted the potential role of data ethics committees.15,[xxxix] There are also existing challenges to AI governance, including around bias, privacy, consent and transparency of algorithms (AI in policing and security).[xl] In 2023, the government proposed to abolish the role of the Biometrics and Surveillance Camera Commissioner (BSCC), and its governance and oversight responsibilities, through the Data Protection and Digital Information Bill.[xli] An independent report suggested that the abolition could leave gaps in future oversight.41,[xlii] However, the bill did not progress before the July 2024 general election.[xliii] The incumbent BSCC resigned in mid-August 2024.[xliv]

The Justice and Home Affairs Committee, and other stakeholders, have raised concerns about how well procurement and the market for AI technologies work, for example in terms of consistency of purchasing, user knowledge of products, selling practices, and security concerns about suppliers.15,38 One research study noted that a lack of consistent and transparent standards for the use of algorithms in police technology could increase the risk of bias, raise costs and inefficiencies, and lower public trust and acceptability.[xlv]

In relation to digital forensics, another study found that the need to keep pace with technological advances had significant implications for purchase and maintenance costs, and for ensuring user compliance and accreditation. It recommended a review of whether strategic investment in shared access to equipment and personnel across policing would help address such challenges.[xlvi] The independent report for the Scottish government included recommendations around improving business cases and evidence assessments for new technologies, adopting appropriate standards, and developing a public operational practice code.28

There are also significant challenges around securing sufficient workforce skills of the right kind to deploy new technologies, whether they are operated by experts or by non-specialists.1,46,[xlvii] The Justice and Home Affairs Committee recommended better training for staff on the legislative context, the risk of bias, and how to use the tools.15 Research studies note further internal barriers to adopting new technology, including migration to new systems, securing financial and political support, and unclear guidelines and procedures.33,46

The growing role of technology in crime, and the increase in online, AI-enabled and cyber-enabled crime, mean that police must process and analyse digital material and evidence increasingly quickly. This can include text, media and metadata; video footage from smart doorbells, smartphones and social media; and materials from developing platforms such as virtual reality, online gaming and the metaverse (PB61).1,6 Academic studies report challenges around the scale of demand for digital forensic examinations and how best to manage it.46,[xlviii]

For the court service, the Justice Committee noted in 2022 that there were delays to the government’s reform programme and a lack of data and analysis to establish its effectiveness.[xlix] The Committee found that, despite potential benefits for efficiency and for public and media access, the use of online procedures and digitisation in courts may have reduced transparency, with variations in accessibility and information quality.23

Key uncertainties/unknowns

Many future applications of technology in the CJS are being explored,[l] for example forensic analysis of genetics and digital evidence, or remote sensing of drugs or weapons (PN731).1 Researchers also debate the extent to which AI could appropriately be used in sentencing to reduce potential bias, increase consistency or speed up access to justice.40,[li] In addition to general issues around the use of AI (AI in policing and security),40 specific challenges for sentencing include how to define and apply considerations such as proportionality, harm, culpability and the seriousness of a crime.[lii],[liii]

Key questions for Parliament

  • What technologies are being used in the CJS, by whom, why and how?
  • Which technologies and applications have the highest risks?
  • How well do mechanisms of oversight, regulation, accountability and transparency work for new technologies in the CJS?
  • How is the government addressing potential gaps in oversight and regulation, including the future of the BSCC?
  • Do the government, police and wider CJS sufficiently understand and evaluate what new technologies and applications are effective?
  • How can the government and CJS best manage the public impacts of new technologies, including on equality and human rights, civil liberties, the right to a fair trial, and privacy?
  • Does the police and CJS workforce have, or have access to, the right skills and resources to deploy new technologies effectively?
  • Is the procurement market for new technologies used by the CJS working well for buyers and suppliers?

References

[i] Chief Scientific Adviser to the Police. Policing Areas of Research Interest, Version 0.1. Accessed 11 October 2024.

[ii] HM Courts & Tribunals Service (2024). The HMCTS Reform Programme

[iii] Law Society (2024). Remote hearings

[iv] See for example the following reports and articles: Office for National Statistics (2022). Nature of fraud and computer misuse in England and Wales: year ending March 2022; UK Finance (2022). Over £1.2 billion stolen through fraud in 2022, with nearly 80 per cent of APP fraud cases starting online; NSPCC (2022). Online grooming crimes have risen by more than 80% in four years; Home Office (2021). Tackling violence against women and girls strategy; National Cyber Security Centre and National Crime Agency (2023). Ransomware, extortion and the cyber crime ecosystem White paper

[v] See for example the following media articles: The Times (2024). Police investigate death of teenager targeted by online blackmail; BBC News (2016). Bid to extradite man over Daniel Perry ‘sextortion’ death; Washington Post (2024). Bobbi Althoff deepfake spotlights X’s role as a top source of AI porn; The Guardian (2024). Anyone could be a victim of ‘deepfakes’. But there’s a reason Taylor Swift is a target

[vi] Todd, C. et al. (2020). Technology, cyberstalking and domestic homicide: informing prevention and response strategies, Policing and Society, 31(1), 82–99.

[vii] Cross, C. et al. (2022). “If U Don’t Pay they will Share the Pics”: Exploring Sextortion in the Context of Romance Fraud. Victims & Offenders, 18(7), 1194–1215.

[viii] Horgan, S (2021). SJF Briefing: The reality of ‘cyber awareness’: findings and policy implications for Scotland The Scottish Centre for Crime & Justice Research.

[ix] Correia, S. G. (2022). Making the most of cybercrime and fraud crime report data: a case study of UK Action Fraud, International Journal of Population Data Science, 7(1).

[x] Scheuerman, M.K. et al. (2021). A Framework of Severity for Harmful Content Online, Proceedings of the ACM on Human-Computer Interaction, Volume 5, Issue CSCW2.

[xi] Caton, S. and Landman, R. (2021). Internet safety, online radicalisation and young people with learning disabilities British Journal of Learning Disabilities, Volume 50, Issue 1, pp: 88–97.

[xii] Ryan, F. (2019). Online abuse of disabled people is getting worse – when will it be taken seriously? The Guardian

[xiii] Department for Science, Innovation & Technology (2024). Online Safety Act: explainer

[xiv] Metropolitan Police. Turnaround Plan 2023-2025

[xv] House of Lords Justice and Home Affairs Committee (2022). Technology Rules? The advent of new technologies in the justice system

[xvi] Guzik, K. et al. (2019). Making the material routine: a sociomaterial study of the relationship between police body worn cameras (BWCs) and organisational routines Policing and Society: Vol 31, No 1: Policing and Technology

[xvii] Higgins, A. and Halkon, R. (2023). Contact and Confidence in a Digital Age: Improving police-public relations with technology Police Foundation

[xviii] HM Courts & Tribunals Service (2024). Fact sheet: Single Justice Service

[xix] The Standard (2023). Facial recognition vans: the case for and against

[xx] McGuire, M. R. (2020). The laughing policebot: automation and the end of policing Policing and Society: Vol 31, No 1: Policing and Technology

[xxi] EVAW Coalition (2023). Listen to us! Communication Barriers: How Statutory Bodies are failing Black, minoritized, Migrant, Deaf & Disabled Women and Girls Victims/Survivors of VAWG

[xxii] Orell, C. (2023). Use of Augmentative and Alternative Communication in Police Interviews All Voices

[xxiii] Justice Committee (2022). Open justice: court reporting in the digital age UK Parliament

[xxiv] Home Affairs Committee (2021). The Macpherson Report: twenty-one years on UK Parliament

[xxv] Lum, C. et al. (2020). Body‐worn cameras’ effects on police officers and citizen behavior: A systematic review Campbell Systematic Reviews, Volume 16, Issue 3.

[xxvi] Bradford, B., et al. (2022). ‘Virtual policing’, trust and legitimacy in Terpstra, J. et al (Eds.), The Abstract Police: Critical reflections on contemporary change in police organisations (213-238)

[xxvii] Fontes, C. and Perrone, C. (2021). Ethics of surveillance: harnessing the use of live facial recognition technologies in public spaces for law enforcement Institute for Ethics in Artificial Intelligence, Technical University of Munich

[xxviii] Aston, E. (2023). Independent review – Independent advisory group on new and emerging technologies in policing: final report Scottish Government

[xxix] Almeida, D. et al. (2022). The ethics of facial recognition technologies, surveillance, and accountability in an age of artificial intelligence: a comparative analysis of US, EU, and UK regulatory frameworks AI Ethics, Vol 2, 377–387

[xxx] See for example the following media articles: BBC News (2020). FBI worried that Ring doorbells are spying on police; The Independent (2024). Amazon’s Ring stops programme to hand doorbell camera footage to police; BBC News (2023). Video doorbells: Police champion them but do they cut crime?

[xxxi] McDonald, N. et al (2020). Privacy and Power: Acknowledging the Importance of Privacy Research and Design for Vulnerable Populations in Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems (CHI EA ’20), 1-8. Association for Computing Machinery

[xxxii] Radiya-Dixit, E (2022). A Sociotechnical Audit: Assessing Police use of Facial Recognition Minderoo Centre for Technology and Democracy

[xxxiii] Mansfield, T. (2023). Facial Recognition Technology in Law Enforcement, Equitability Study Final Report National Physical Laboratory

[xxxiv] Laufs, J. and Borrion, H. (2022). Technological innovation in policing and crime prevention: Practitioner perspectives from London International Journal of Police Science & Management, 24(2), 190-209.

[xxxv] Aston, E. et al. (2021). Information sharing in community policing in Europe: Building public confidence European Journal of Criminology, 20(4), 1349-1368

[xxxvi] Aston, E. et al. (2022). Technology and Police Legitimacy in Verhage, A. et al. (eds) Policing in Smart Societies. Palgrave’s Critical Policing Studies. Palgrave Macmillan

[xxxvii] Wells, H. et al. (2022). ‘Channel shift’: Technologically mediated policing and procedural justice International Journal of Police Science & Management, Volume 25, Issue 1.

[xxxviii] Charlesworth, A. et al. (2023). Response to the UK’s March 2023 White Paper “A pro-innovation approach to AI regulation” SSRN

[xxxix] Oswald, M. et al (2024). Ethical review to support Responsible Artificial Intelligence (AI) in policing: A preliminary study of West Midlands Police’s specialist data ethics review committee.

[xl] Science, Innovation and Technology Committee (2024). Governance of artificial intelligence (AI) UK Parliament

[xli] Fussey, P. and Webster, W. (2023). Changes to the functions of the BSCC: independent report (accessible) Biometrics and Surveillance Camera Commissioner, GOV.UK

[xlii] Biometrics and Surveillance Camera Commissioner (2023). Report finds ‘worrying vacuum’ in surveillance camera plans GOV.UK

[xliii] UK Parliament (2024). Data Protection and Digital Information Bill

[xliv] Biometrics and Surveillance Camera Commissioner (2023). Resignation of the Biometrics and Surveillance Camera Commissioner GOV.UK

[xlv] Zilka, M., et al. (2023). Exploring Police Perspectives on Algorithmic Transparency: A Qualitative Analysis of Police Interviews in the UK in EAAMO ’23: Proceedings of the 3rd ACM Conference on Equity and Access in Algorithms, Mechanisms, and Optimization Association for Computing Machinery

[xlvi] Griffiths, C. et al. (2024). The Northumbria University Digital Forensics Project. Digital Forensics within the Criminal Justice System – Use, Effectiveness and Impact Northumbria University report

[xlvii] Sandhu, A. and Fussey, P. (2020). The ‘uberization of policing’? How police negotiate and operationalise predictive policing technology Policing and Society: Vol 31, No 1: Policing and Technology

[xlviii] Rappert, B. et al. (2020). Rationing bytes: managing demand for digital forensic examinations Policing and Society: Vol 31, No 1: Policing and Technology

[xlix] Justice Committee (2022). Court Capacity UK Parliament

[l] University of Warwick, Department of Statistics (2024). Probabilistic Decision Support Systems in the criminal justice system: draft report in Interdisciplinary workshops on Statistics and the Law 3-4 January 2024

[li] The Alan Turing Institute (2023). The use of AI in sentencing and the management of offenders Workshop on 27 February 2023.

[lii] Maths for Justice Virtual Study Group (2023). The role of AI in the justice system; opportunities, dangers and ethics Workshop 20-22 November 2023. Institute for Mathematical Innovation, University of Bath

[liii] Tasioulas, J. (2019). The Rule of Algorithm and the Rule of Law

Horizon Scan 2024

Emerging policy issues for the next five years.