All across the UK, local authorities are well-positioned to draw on insights from data-driven technologies to improve public services and to better the lives of their constituents. Seizing this opportunity, however, depends not only on scientific ingenuity. Just as important, and complementary to good data science, are practices of responsible and ethically informed innovation.

The Institute’s Ethics Fellow David Leslie and I, representing the Turing’s thriving public policy programme, had the pleasure of being hosted by the Greater Manchester Combined Authority (GMCA) for a workshop with public authorities on developing ethical innovation. The workshop had a particular focus on machine learning applications for the public good.

One of the public policy programme's aims is to bring researchers and policy makers together to develop innovative ways of providing public services. The workshop, a first-of-its-kind pilot, pursued this aim by interactively exploring the Turing's guidance, Understanding artificial intelligence ethics and safety. The guidance was developed in partnership with the Office for AI and the Government Digital Service, and now forms part of the Office for AI's Guide to using artificial intelligence in the public sector.

The workshop convened 15 experts from across four local authorities in Greater Manchester (an area with a combined population of almost 3 million), all eager to learn about the safe and ethical use of data science in their work. At the start of the workshop, David Leslie welcomed participants by reaffirming the important role that public authorities play in caring for the public interest and stewarding citizens' data, underscoring the importance of making public authorities better-informed creators and users of innovation.

There was broad agreement that data science could improve public services, for example by targeting services and support where they are most needed, or by identifying patterns indiscernible to humans and thereby informing policy. However, workshop participants also raised questions with ethical implications that they wanted to resolve together, both during and after the workshop, such as how to involve citizens in innovation or how to use available data responsibly.

The workshop introduced practitioners across the public sector to key concepts, values, and principles of digital ethics, and explored their application in the practice of public services innovation. Through interactive sessions, participants deliberated and answered questions around the ethics of the data, design, and use of machine learning in their work. In addition, the workshop brought together stakeholders across different local authorities, establishing connections for further discussions beyond the event.

Participants were eager to understand how they could 'do' ethical innovation in practice. They discussed the availability and quality of the data sources they could use and considered what that meant for the possibilities of ethical machine learning. They explored the importance of communicating with and training the users of algorithmic tools, and agreed on the need for greater transparency and citizen involvement in innovation processes, both to inform innovation and to build public trust.

Finally, this workshop showcased the importance that both the Turing and Greater Manchester place on ethical innovation and was one of many steps towards responsible data science for the public good. The conversation it generated about the needs and challenges faced by practitioners in local authorities will also inform future iterations of the Turing's guidance, grounding it in real-world applicability and bringing it closer to its intended users: public authorities and public bodies.

The workshop took place in November 2019. For more information on the guidance or the workshop, please contact [email protected].

Cover photo credit: Christina Hitrova