Project ExplAIn enters its next phase

The Turing and the Information Commissioner’s Office continue to work on their first-of-its-kind guidance on AI explainability

Tuesday 16 Nov 2021


‘Explaining decisions made with AI’, guidance on AI explainability co-produced by The Alan Turing Institute and the Information Commissioner’s Office (ICO), was published on the ICO website in late May 2020. The publication marked the culmination of nearly two years of collaborative effort, including extensive desk-based research, two citizens’ juries, and several rounds of multi-stakeholder consultation. The result was the most comprehensive practical guide on AI explanation produced anywhere to date. The guidance has since been cited dozens of times in academic papers and by international law firms, AI policy-related blogs, newsletters and websites.

Since the release of ‘Explaining decisions made with AI’, the Turing and the ICO have been working together on further public engagement, to shed light on how organisations are using the guidance in practice and on how it can be made even more user-friendly and accessible. Phase two of Project ExplAIn has so far consisted of the creation of workbooks to accompany the guidance, a series of workshops, and an informational video produced by Fable Studios to communicate the guidance’s key points to a wider audience.

‘Explaining decisions made with AI’ workbooks and workshops

At the beginning of 2021, our project team assembled two workbooks to support the uptake of the guidance. The workbooks summarise the main themes of ‘Explaining decisions made with AI’ in a non-technical way. Each workbook has also served as the basis of a workshop exercise built around one of two use cases, created to give organisations and individuals a flavour of how to put the guidance into practice.

The workbooks were written to support the second phase of Project ExplAIn, centred on stakeholder outreach and practice-based evaluation. This has included a series of engagement activities held in January 2021 to assess the usability, accessibility and clarity of the guidance, as well as organisations’ readiness to put explainable AI principles into practice. In partnership with Manchester Metropolitan University (MMU) and the ICO, two virtual workshops were held: one with SMEs from the advertising, AI development, finance, recruitment, health, education, fraud protection, media and insurance sectors, and a second with public sector organisations. The workshops engaged participants from a variety of backgrounds, levels of seniority, and roles across the public and private sectors. We are extremely grateful to them for their energy, enthusiasm and tremendous insight.

The workshops were held via Zoom and used an e-board tool, vWall, which allowed participants to write comments, ask questions, and engage with the material. They were structured around two hypothetical case studies: one involving an AI-assisted recruitment tool and one based on a machine learning application in children’s social care. These were written so that participants could apply aspects of the guidance to a specific scenario, and to provide realistic examples of how the guidance could be applied. Before each session, the workbooks were sent to participants with a series of reflection questions to help them prepare for the workshop.

We hope that the workbooks will allow for more widespread use and dissemination of the guidance. Each workbook begins with a condensed form of ‘Explaining decisions made with AI’, presenting the four principles of AI explainability, the basics of an explanation-aware approach to AI innovation, and the practical tasks needed for the explanation-aware design, development and use of AI systems. It then provides reflection questions intended as a launching pad for group discussion. The appendices focus on the workshop setting and the case studies: Appendix A sets out how to use the workbook in a workshop, including details of the necessary resources, personnel and recommended timings, based on the workshops co-hosted with the ICO and MMU in January 2021; Appendix B contains the case study; and Appendix C provides a checklist for one or more of the explanation types, to be used in tandem with the case study.

Case studies found in the workbooks:

  • The ‘AI-assisted recruitment tool’ case study depicts a company considering the use of an AI-assisted recruitment tool to support HR personnel with future job vacancies. The tool uses a variety of personal and professional criteria to determine which candidates would be the best fit for the organisation. The case study provides detail on the data available to the organisation, model type considerations, and the roles and responsibilities within the organisation. Participants are asked to focus on responsibility explanation, data explanation, and fairness explanation by applying a checklist of tasks to the case study.
  • The ‘Machine learning for children’s social care’ case study describes an organisation contemplating the use of a machine learning algorithm in a children’s social care setting. The algorithm would be used to identify children at risk, in order to determine whether they should be taken into care. The case study gives detailed explanations of the variables available and the exploratory data analysis that took place. It then illustrates what a logistic regression model trained on the data would look like, with feature importance plots to help participants think through how each variable factors into an explanation of the model (a minimal sketch of this kind of model appears after this list). Additionally, the case study presents a hypothetical scenario in which the model is applied to a specific family. Participants are asked to focus on rationale explanation by applying a checklist of tasks to the case study.
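
To give a flavour of why logistic regression suits rationale explanation, the sketch below shows how such a model’s coefficients provide a global view of feature importance, and how coefficient-times-value contributions explain an individual prediction. It is a minimal illustration only: the feature names and synthetic data are hypothetical assumptions for this post, not the workbook’s actual model or data.

# A minimal sketch (not the workbook's model or data) of how a logistic
# regression supports rationale explanation: coefficients give a global
# feature importance, and coefficient * feature-value gives a per-case
# contribution. All feature names here are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical features a referral record might contain
feature_names = ["prior_referrals", "school_absence_rate",
                 "household_income", "parental_support_score"]

# Synthetic stand-in data: 500 records, binary "at risk" label
X = rng.normal(size=(500, len(feature_names)))
logits = 1.2 * X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 2] - 1.0 * X[:, 3]
y = (logits + rng.normal(scale=0.5, size=500)) > 0

# Standardise so coefficient magnitudes are comparable across features
scaler = StandardScaler().fit(X)
model = LogisticRegression().fit(scaler.transform(X), y)

# Global view: coefficients as feature importances
for name, coef in zip(feature_names, model.coef_[0]):
    print(f"{name:>24}: {coef:+.2f}")

# Local view: each feature's contribution to one hypothetical case's score
case = scaler.transform(X[:1])
contributions = case[0] * model.coef_[0]
print("\nPer-feature contribution to this case's log-odds:")
for name, c in zip(feature_names, contributions):
    print(f"{name:>24}: {c:+.2f}")
print(f"Predicted risk probability: {model.predict_proba(case)[0, 1]:.2f}")

Because the model is linear in the log-odds, each feature’s contribution to a prediction can be read off directly, which is what makes feature importance plots of the kind described in the workbook straightforward to construct and discuss.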

These workbooks would simply not exist without the commitment and keenness of all our collaborators and workshop participants, and we would like to thank them again for their involvement.

Next in this phase of Project ExplAIn, we plan to build further engagement with SMEs and public sector organisations, focusing on the use of the guidance in practice and updating it in line with the insights drawn from these valuable engagements.

View the full guidance

We are grateful to EPSRC grants EP/T001569/1 and EP/W006022/1 for making this research possible.


Top image: OlegRi / Shutterstock