Algorithmic transparency is achievable for policing and could bring significant rewards, a new research report finds.
Led by Dr Marion Oswald, MBE, the research explores the implications for police forces of participation in the Government’s new Algorithmic Transparency Standard. Based on interviews with police personnel and representatives of commercial providers, the research was conducted in parallel to the piloting of the Standard by the Cabinet Office and the Centre for Data Ethics and Innovation. The aim of the Standard is to ‘promote trustworthy innovation by providing better visibility of the use of algorithms across the public sector, and enabling unintended consequences to be mitigated early on.’
Rewards outweigh risks
Interviewees generally thought that the rewards for the police of a carefully tailored Standard implemented at the right stage of algorithmic development outweighed the risks. Participation in the Standard provides an opportunity for the police to demonstrate the legitimacy of technology use and build earned trust. As one research participant commented:
‘If people are worried, there's an opportunity there as opposed to a risk, there's an opportunity to be open. There's an opportunity to be transparent. There's an opportunity to explain, to reduce concerns.’
The Standard could also be used to increase the sharing of best practices (and pitfalls to avoid) among police forces. Research participants were keen for compliance with the Standard to become part of a system driving reflective practice across policing in the development and deployment of algorithmic technology. This could enable police forces to learn from each other, facilitate good policy choices and reduce wasted costs. For example, one interviewee suggested:
‘It might also be possible to create a peer reviewing mechanism of sorts. I feel that without this oversight, we will be missing the main benefit of transparency; the ability to ensure that algorithms used in the public sector are up to the task, and are being built properly, with proper attention to the data, and to the drift that occurs with algorithms.’
To help improve the quality of policing technology, the report concludes, the Standard should be linked to methods of oversight and the promotion of best practice on a national basis. Otherwise, the Standard may come to be regarded as an administrative burden rather than a benefit to policing.
Transparency concerns mitigated
The research recognises the confidentiality concerns around policing contexts and tradecraft, where revealing technical details could give criminals an advantage. These concerns could be mitigated by, for example, a non-public version of the Standard for sensitive applications and tools, which would be available for review by bodies with an independent oversight function.
To support police compliance with the Standard, supplier responsibilities – including appropriate disclosure of algorithmic functionality, data inputs and performance – should be covered in procurement contracts and addressed up front as a mandatory requirement of doing business with the police.
The report also suggests a number of areas for amendment that could improve the Standard for the benefit of all participants. These include clarifying the scope of the Standard and the stage of project development at which it should apply. A lighter-touch version of the Standard – ‘Standard-Lite’ – is proposed for early or trial-stage projects.
Dr Oswald said: ‘Transparency has not traditionally been associated with policing technology, for understandable reasons, but this is changing. Our interviewees – both from policing and the commercial sector – recognised how transparency can bring rewards. It can enable the public to understand more about how the police are using technology and the reasons for this. It can also enable the police themselves to learn from each other and improve what they are doing.’
The full report ‘The UK Algorithmic Transparency Standard: A Qualitative Analysis of Police Perspectives’ by Marion Oswald (The Alan Turing Institute, Northumbria University), Luke Chambers (Northumbria University), Ellen P. Goodman (Rutgers University), Pamela Ugwudike (Southampton University) and Miri Zilka (Cambridge University) is available to read here.