Annual Report 2021–22

Section 1: Highlights of the year 3
Section 2: Trustees’ and strategic report 51
Section 3: Financial statements 75

Section 1
1.1 Chair’s report 4
1.2 Institute Director’s report 6
1.3 Research highlights of the year 11
1.4 Partnerships and collaborations 23
1.5 Equality, diversity and inclusion 27
1.6 Growing our network 30
1.7 Working with policy makers 34
1.8 Public engagement 39
1.9 Convening academia, industry and policy makers 43
1.10 Skills and training 48

Section 1.1 Chair’s report

I have been proud to serve as the first Chair of the Institute’s Board of Trustees. It has been a privilege to work with the Engineering and Physical Sciences Research Council (EPSRC), our founding and new university partners, our strategic partners, and others to build a thriving, world-class research institute. Our unique view of data science and AI is based on the breadth of expertise available to us at the Institute, stretching from the computational sciences to engineering, to the social sciences and beyond.

As I reflect on my seven years at the Institute, it is clear to me that there has been rapid progress, underpinned by unstinting support from across the UK’s data science and AI community. This impact has included playing an important part in responding to the pandemic, leading in areas such as data safe havens, and helping to drive the emergence of exciting areas such as digital twins. Elsewhere, we have brought together some of the top data science talent in the UK through our Data Study Groups, and launched AI UK, which has quickly become an inspiring showcase attracting the biggest names in the AI ecosystem. The Institute has also built many exciting new partnerships and collaborations with industry and the third sector, and has played a pivotal role advising government bodies on challenges such as AI explainability, data ethics and online harms.

The UK has continued to establish itself as one of the world’s key research hubs and has done remarkably well to maintain its reputation for groundbreaking research, innovation and borderless collaboration. I have seen the Institute launch a range of research projects and interest groups, and establish a diverse range of programmes with our strategic partners. We are now seeing the impact of this research emerge: the Institute has become an innovator in the analysis and application of data science and AI. We have laid the foundations for a future of world-leading academic excellence by convening academia, industry, the government and the third sector on a truly national scale.

We are now at a critical time as the government has launched its first National AI Strategy, and the Institute is playing a leading role in its implementation while developing its own strategic response.

I am delighted to welcome our new Chair, Douglas Gurr. Under his leadership, I am confident that the Board of Trustees will ensure the continued success of the Institute during this important stage in its development. Finally, my thanks go to Adrian Smith, the Board, everyone at the Institute, and our vibrant network of partners and community of collaborators. I look forward to seeing the Institute continue to drive the kind of research and innovation that will be vital to the future of our environment, security, wellbeing, society and economy.
Howard Covington Chair of the Board of Trustees 2015–2022   Section 1.2 Institute Director's report This year’s launch of the UK’s first-ever National AI Strategy is a landmark moment and an important opportunity to advance the UK’s international reputation. As the national institute for data science and artificial intelligence, we are now actively supporting this ambitious agenda to help establish the UK as a global AI superpower. One of the Strategy’s key priorities is health and wellbeing, and so I was delighted that the Institute launched a five-year strategic partnership with global healthcare company Roche. The collaboration will enable us to develop new approaches to personalised healthcare. The Institute also welcomed the establishment of a new Research Support Facility funded by the National Institute for Health and Care Research (NIHR) to help improve understanding of multiple long-term health conditions. These new partnerships have emerged as we continue to reflect on the pandemic and think about how data science and AI can help to improve health, wellbeing and economic outcomes. Our new strategic partnership with the Office for National Statistics will support this by delivering crucial, near-real-time statistics to help track changes in the UK economy. Engaging with UK universities is an important part of our role as a national institute. In pursuit of new collaborations, universities across the UK were among the first-ever successful applicants to the Turing Network Development Awards. These innovative awards were made to institutions from Scotland, Wales, Northern Ireland and England – each with its own particular area of proven research excellence. The Institute also signed a new agreement with DSO National Laboratories in Singapore. The purpose of the strategic partnership is to advance the state-of-the-art knowledge and capability in key data science and AI challenge areas, contributing to public good and the national research interests of Singapore and the UK. The Institute has enjoyed rapid growth and our new strategic approach to delivering impact around national strategies (Turing 2.0) will represent the next stage of the Institute’s development, building on our place within the science and innovation landscape. This is a pivotal moment in the Institute’s evolution, and it is being steered by our first Chief Scientist, Mark Girolami. Mark is already nurturing new research and innovation strategies, encouraging cross-disciplinary working, and engaging with wider academic, government, commercial, business and industrial sectors. “Turing 2.0 will represent the next stage of the Institute’s development, building on our place within the science and innovation landscape.” It is great to see the Institute adding to the national scientific output through a distinctive portfolio of world-class research projects. As you read our selection of 2021-22 highlights, you will see how our array of talented researchers are at the heart of novel projects across a range of domains, including robotic exoskeletons helping people to walk, an algorithm to reduce the shipping industry’s carbon footprint, and the fascinating use of new tools for image analysis. Finally, I would like to thank Howard Covington for his unstinting support and for being a trusted voice and ambassador for the Institute. Howard can be proud of what he has helped the Institute to achieve during a complex and challenging period. 
My thanks go to the Board, the business team, the connected research community and our partners for another year of innovation and scientific excellence.

Adrian Smith
Institute Director and Chief Executive

Strategic partners
The Turing’s unique position at the interface of academia, business, the third sector and the public sector distinguishes us from other research institutions. In addition, it creates a wealth of potential collaborative opportunities, in particular through our dynamic relationships with our strategic partners. These relationships not only contribute to what sets the Turing apart from many other research organisations, but also drive our innovation and impact. As well as delivering ambitious programmes of research, our strategic partnerships seek to build meaningful connections between academic excellence and real-world challenges in business, government, skills and engagement.

Founding partners
The Turing’s founding partners are the University of Cambridge, University of Edinburgh, University of Oxford, University College London, University of Warwick and EPSRC. Answering a national need for investment in data science research, they formed the Turing as a joint venture in 2015, following an open competition run by EPSRC. Each founding university has appointed a Turing University Lead who acts as an interface between the Turing and that university.

University partners
Our university network, expanded from the Turing’s five founding universities, enables the Institute to conduct even more ambitious collaborative research.

Section 1.3 Research highlights of the year

Supporting the government’s drive towards trustworthy AI
Working towards a safer internet
Pioneering new tools for image analysis
Reducing the carbon footprint of the shipping industry
Improving lives in Bradford
Optimising robotic exoskeletons to help people walk
Developing state-of-the-art defences for computer networks
The Turing Way: blazing a trail for reproducible research
Providing COVID-19 expertise to the UK government
Living with Machines: harnessing the hive mind

“I am delighted to introduce the Turing’s research highlights of the year – a whistlestop tour through some, but by no means all, of our most successful projects in 2021-22. These case studies demonstrate the brilliant diversity of our research, from history, finance and robotics to healthcare, environment and online harms. They showcase the Turing’s value in convening interdisciplinary teams from across academia, industry and the public sector – an approach that is reaping rewards through discernible real-world impact. We are committed to building on this success as we look ahead to the Turing’s next phase, in which we will place even greater emphasis on the effective combination of research and innovation to tackle challenges of societal importance.”
Mark Girolami
Chief Scientist, The Alan Turing Institute

Supporting the government’s drive towards trustworthy AI
Related programmes and teams: Finance and economics

As AI systems become ever more sophisticated, they are making and supporting a growing number of decisions in areas ranging from finance and healthcare to employment, law and transport. These systems offer huge potential benefit to society and the economy, but they come with risks. Is the system free from bias? Does it comply with regulations? Are there privacy and security concerns?
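Checks of this kind can be partly automated. The snippet below is a purely illustrative sketch, not a CDEI or Turing tool, of one basic measurement an assurance service might report: the demographic parity gap, the difference in positive-decision rates between two groups. The decisions, group labels and threshold interpretation are all invented for the example.

```python
# Illustrative only: a toy "assurance" check measuring demographic parity.
# 'decisions' are a model's binary outcomes; 'groups' are a protected attribute.

def demographic_parity_gap(decisions, groups):
    """Return (gap, per-group rates): the spread in positive-outcome rates across groups."""
    rates = {}
    for g in set(groups):
        outcomes = [d for d, grp in zip(decisions, groups) if grp == g]
        rates[g] = sum(outcomes) / len(outcomes)
    return max(rates.values()) - min(rates.values()), rates

# Hypothetical loan-approval decisions for applicants in groups "A" and "B"
decisions = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"]

gap, rates = demographic_parity_gap(decisions, groups)
print(f"Positive-decision rates by group: {rates}")
print(f"Demographic parity gap: {gap:.2f}")  # a persistently large gap may flag potential bias
```

Real assurance services go well beyond single metrics, taking in documentation, auditing and regulatory compliance, but automated checks of this kind are among the tools such an industry would offer.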
To help counter these risks, the government’s Centre for Data Ethics and Innovation (CDEI) is leading the development of an ‘AI assurance’ industry that will provide tools and services to enable businesses, consumers and regulators to verify that AI systems are effective, trustworthy and legal. The Turing has become involved in this area through Lukasz Szpruch, Programme Director for Finance and Economics, who was an advisor to a pilot project between University College London and the CDEI. The result of this project was a 2021 paper that summarises the key concepts of AI assurance and identifies some of the difficulties in this area, such as how to create standards for algorithm safety when algorithms’ uses and risks vary so much between different sectors. This paper has in turn informed the ‘Roadmap to an effective AI assurance ecosystem’, a first-of-its-kind report published by the CDEI in December 2021, which sets out the steps required to build an AI assurance ecosystem in the UK. Lukasz continues to work in this area as part of Project FAIR – a collaboration between the Turing and partners including HSBC and Accenture (also see page 25), a goal of which is to develop clear guidelines for AI assurance within the finance sector.

“For AI to be used responsibly, ethical principles need to be translated into real-world practice. This work with the CDEI will help guide the UK towards a thriving industry dedicated to that goal.”
Lukasz Szpruch
Programme Director for Finance and Economics, The Alan Turing Institute

Pioneering new tools for image analysis
Related programmes and teams: Data science for science and humanities; Research engineering; AI for science and government

Image analysis is central to myriad research areas, from climate change and human history to medicine, microbiology and astronomy. Computer vision techniques are used to automatically extract useful information from images, saving researchers time and energy. However, many of the researchers who are analysing images are not computer vision experts: there is a knowledge gap between those developing the algorithms and those whose research can benefit. Two new tools developed at the Turing are aiming to bridge this gap.

The first, called MapReader, is a free, open-source software package written in the Python programming language. Created as part of the Living with Machines project (also see page 22), it is the first tool that enables historians to automatically find features in maps on a large scale. MapReader’s key feature is to divide maps into patches – a more efficient approach than considering every map pixel individually. To demonstrate the tool’s capability, the team has used it on over 16,000 19th-century Ordnance Survey maps of the UK (digitised by the National Library of Scotland) to identify buildings and railway infrastructure, creating a visualisation of the growth of urban areas and the rail industry after the Industrial Revolution.

The second tool, called scivision, is a more general-purpose Python toolkit for analysing scientific imagery. The team developing it at the Turing aims to create a searchable, open-source catalogue of computer vision algorithms and image datasets for the wider research community – enabling algorithm developers to find image creators, and vice versa – as well as an interface that allows users to easily load the datasets and run the models.
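To make the patch-based approach concrete, here is a minimal, hypothetical sketch of dividing a large map image into fixed-size patches and classifying each one. It uses plain Python with the Pillow imaging library rather than MapReader’s actual API, and the file name and classification rule are invented for illustration.

```python
# Illustrative sketch of patch-based image analysis (not MapReader's API).
from PIL import Image

PATCH_SIZE = 256  # pixels; users choose a patch size suited to the map scale

def iter_patches(image, size=PATCH_SIZE):
    """Yield (x, y, patch) tuples covering the image in a regular grid."""
    width, height = image.size
    for top in range(0, height, size):
        for left in range(0, width, size):
            yield left, top, image.crop((left, top, left + size, top + size))

def classify_patch(patch):
    """Placeholder classifier: a trained computer vision model would go here."""
    # For illustration only, label a patch 'building' if it is darker than a threshold.
    grey = patch.convert("L")
    mean_brightness = sum(grey.getdata()) / (grey.width * grey.height)
    return "building" if mean_brightness < 100 else "other"

# Hypothetical input file; a real workflow would loop over thousands of map sheets.
sheet = Image.open("os_map_sheet.png")
labels = {(x, y): classify_patch(p) for x, y, p in iter_patches(sheet)}
print(f"Classified {len(labels)} patches")
```

In MapReader itself, a trained computer vision model does the classifying; the toy brightness rule above simply stands in for it.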
To showcase scivision’s potential, the team is developing a series of use cases, including classifying plankton species in images of ocean samples (this began as a Data Study Group at the Turing with the Centre for Environment, Fisheries and Aquaculture Science (Cefas), which is now testing the resulting algorithm on its research ship), and adapting the MapReader tool to categorise the physical features of plants, which could, for example, help researchers quantify how plants respond to different climate conditions.

“What’s really powerful about these tools is the way they are combining knowledge from the humanities and the sciences. It’s a great example of cross-pollination within the Turing.”
Katherine McDonough
Senior Research Associate and MapReader team member, The Alan Turing Institute

Reducing the carbon footprint of the shipping industry
Related programmes and teams: Data-centric engineering

The oceans are the superhighways of international trade, with about 80% of global goods carried by sea. But this comes at an environmental cost: the shipping industry emits around a billion tonnes of greenhouse gases – mostly CO2 – per year. This is about 3% of the global greenhouse gas emissions caused by human activity. Shipping companies are looking for ways to reduce their carbon footprint, and one potential solution lies in optimising ships’ routes so that they burn less fuel.

To this end, a team led by the Turing’s Adam Sobey has developed voyage optimisation software that plans the most fuel-efficient route through the waves. This is a collaboration with UK company Theyr, which specialises in supplying high-resolution weather and ocean data to the maritime sector. Theyr’s data feeds directly into the software, which uses what is known as a ‘genetic algorithm’, inspired by Darwin’s theory of evolution by natural selection. The algorithm creates a population of possible routes, and then mathematically combines (‘mates’) pairs of the most successful routes (i.e. those which arrive on time while using lower amounts of fuel and avoiding poor weather/ocean conditions). By repeating this process over multiple ‘generations’ of routes, the algorithm quickly arrives at an optimal solution.
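For illustration, a stripped-down genetic algorithm of this kind might look like the sketch below. The waypoints, the fuel-cost function and the ‘storm band’ are entirely made up; this is not Theyr’s data or the Turing team’s software, just the population–mate–mutate–select loop described above.

```python
# Toy genetic algorithm for route selection (illustrative only).
import random

WAYPOINTS = 10          # number of intermediate latitude offsets along the route
POP_SIZE = 50
GENERATIONS = 200

def random_route():
    """A route is a list of latitude offsets (degrees) at fixed longitudes."""
    return [random.uniform(-2.0, 2.0) for _ in range(WAYPOINTS)]

def fuel_cost(route):
    """Made-up cost: penalise detours and a fictional band of bad weather."""
    detour = sum(abs(a - b) for a, b in zip(route, route[1:]))
    weather = sum(1.0 for offset in route if 0.5 < offset < 1.5)  # storm band
    return detour + 3.0 * weather

def mate(parent_a, parent_b):
    """Crossover: take each waypoint from one parent or the other, then mutate."""
    child = [random.choice(pair) for pair in zip(parent_a, parent_b)]
    if random.random() < 0.3:  # occasional mutation keeps the search exploring
        child[random.randrange(WAYPOINTS)] += random.uniform(-0.5, 0.5)
    return child

population = [random_route() for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fuel_cost)          # fittest (cheapest) routes first
    parents = population[: POP_SIZE // 5]   # keep the best 20%
    population = parents + [
        mate(random.choice(parents), random.choice(parents))
        for _ in range(POP_SIZE - len(parents))
    ]

best = min(population, key=fuel_cost)
print(f"Best route cost: {fuel_cost(best):.2f}")
```

The production system also has to respect arrival times and live weather and ocean forecasts, which is why Theyr’s high-resolution data feeds directly into the optimiser.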
The software is now being used by Euronav, a crude oil tanker company that has a fleet of 75 ships, and the researchers estimate that it will reduce fuel use by around 5% compared with other routing algorithms. Large vessels typically burn 50-75 tonnes of fuel per day, so this could save as much as £2,000 per day per vessel. And with this, of course, comes a significant environmental benefit: reduced fuel means reduced emissions.

“By optimising our shipping routes, this innovative software has the potential to significantly reduce both our fuel costs and our environmental impact. We are now rolling it out across our entire fleet of tankers.”
Patrick Declerck
Operations Manager, Euronav

Improving lives in Bradford
Related programmes and teams: Urban analytics; AI for science and government

Holme Wood in south-east Bradford is one of the most deprived housing estates in the UK. Historically, initiatives aimed at improving people’s lives there or in other deprived areas of Bradford were typically ‘top-down’ national policies, applied locally. However, these have had limited success: blanket policies are unlikely to be equally effective in every area they are applied to, even within the same city.

A collaboration between the Turing and Leeds Institute for Data Analytics is using data science to better understand deprivation in Holme Wood and to generate tailor-made policies. The researchers, led by Turing Fellow Mark Mon-Williams, have developed a two-pronged approach. First, they build statistical models that pull in data on residents’ education, social care, employment, housing and transport access, as well as data from the world-leading Born in Bradford study, which is tracking the health of over 13,500 children born in the city between 2007 and 2010. Second, the team presents the findings of these models to Holme Wood residents, putting them at the centre of discussions about potential solutions.

One of the team’s models, for example, analyses access to food outlets in the estate, showing that while there are plenty of fast food outlets, it is difficult to access healthy, affordable food. Residents are now in direct conversations with local service providers about how fresh food could be more easily distributed to the community, such as via food banks at schools. The team has started to expand its methodology to other deprived areas of Bradford, and has interest from central government in expanding it nationally.

“This project is combining data science and lived experience to understand the challenges and opportunities faced by those living in some of Bradford’s most deprived places. I am now using the team’s methodology to help improve support to children with neurodiverse conditions, and their families.”
Ava Green
Associate Director for Autism, Bradford District Care NHS Foundation Trust

Optimising robotic exoskeletons to help people walk
Related programmes and teams: Artificial intelligence

Being able to walk with the aid of a robotic exoskeleton was once the preserve of sci-fi films. Recently, though, these wearable robots have started appearing in medical settings, helping to restore the mobility of people who have difficulty walking, whether through illness, injury or disability. However, walking is a complex task, and exoskeletons, which use motorised joints to replicate human motion, do not always deliver a substantial reduction in effort. A project at the University of Edinburgh, funded by the Turing and Honda, is aiming to improve their efficiency. The researchers have developed an exoskeleton controller that uses a computer model of the human musculoskeletal system to estimate the wearer’s energy levels in real time, allowing the controller to adjust the exoskeleton’s response and reduce the wearer’s exertion. The team is now expanding this work to accommodate a second person interacting with the exoskeleton wearer. This scenario is relevant to care homes, for instance, where the wearer might be assisted by a human carer as they walk or stand up. Or in a factory, someone might wear an exoskeleton to help with a heavy-lifting task that also involves another worker. The researchers’ new exoskeleton controller will adapt not just to the wearer, but also to the other person’s assistance. All of this research will help to make exoskeletons more useful in real-world settings. Eventually, perhaps we will be using exoskeletons in the way that people now use electric bikes, counting on them to give us a motorised boost when the going gets tough. “Exoskeletons hold great promise in restoring people’s mobility.
Our work is helping to take this research from the lab into the real world.” Sethu Vijayakumar Programme Co-Director for Artificial Intelligence, The Alan Turing Institute and Professor of Robotics, University of Edinburgh Working towards a safer internet Related programmes and teams Public policy AI for science and government For all its positives, the internet is a place where hate speech, harassment, extremism and misinformation thrive. The sheer amount of harmful content, and the speed with which it travels, make it difficult to keep track of and understand the problem. Furthermore, policy makers are faced with the challenging task of shielding users from harm while protecting freedom of speech. In response to these issues, the Turing’s public policy programme set up an Online Safety Team – a group of researchers who are using data science and AI to provide regulators and policy makers with the tools and evidence they need to detect and tackle toxic online content. In March 2022, the team launched the Online Harms Observatory, a collaboration with the Department for Digital, Culture, Media & Sport (DCMS) that identifies and monitors harmful content in real time using a custom-built dashboard. Its first application, which has been used for research commissioned by Ofcom, is tracking abuse on Twitter directed at Premier League footballers. Underpinned by algorithms developed at the Turing, the Observatory is set to be rolled out fully to regulators and policy makers in the coming months. Also this year, the Turing’s Bertie Vidgen, who played an instrumental role in setting up the Online Safety Team, provided direct input to the government’s Online Safety Bill – a landmark piece of legislation which is currently going through Parliament, aimed at protecting internet users and holding tech companies to account. Bertie was a specialist advisor to the Joint Committee on the Draft Online Safety Bill, the result of which was a 200-page report with 127 recommendations, of which the government accepted 66. As harmful content touches ever more lives, the team will continue to work closely with stakeholders such as DCMS and Ofcom to help create a safer online environment for all. “The internet should feel like a safe space, not a battleground. The Turing’s Online Safety Team is pioneering the use of data science and AI to detect, understand and counter online hate and abuse.” Bertie Vidgen Outgoing Head of Online Safety The Alan Turing Institute Developing state-of-the-art defences for computer networks Related programmes and teams Defence and security Every computer network is vulnerable to cyber criminals and hackers. Detecting attackers within a network is a complex task, and even skilled operators struggle to keep track of them all. In 2020, for example, the average amount of time attackers spent inside compromised networks was an estimated 24 days. During this so-called ‘dwell time’, hackers can perform malicious activities such as stealing data, installing malware and disrupting services. To bolster network defences and speed up response times, computer security organisations are increasingly looking to autonomous systems. Researchers led by the Turing’s Vasilios Mavroudis and Chris Hicks are exploring a technique called reinforcement learning (RL) – a field of artificial intelligence in which computer algorithms learn by solving problems through trial and error, with the goal of maximising a specified reward (in this case, maximising a network’s security). 
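As a rough illustration of that trial-and-error loop, the sketch below trains a tabular Q-learning agent on a toy two-state ‘defend the network’ environment. The states, actions, probabilities and rewards are invented for the example; this is a generic sketch, not the team’s CAGE agents.

```python
# Generic Q-learning sketch for a toy network-defence problem (illustrative only).
import random

ACTIONS = ["monitor", "restore_host", "deploy_decoy"]
STATES = ["secure", "host_compromised"]

def step(state, action):
    """Toy environment: returns (next_state, reward)."""
    if state == "host_compromised":
        if action == "restore_host":
            return "secure", +10          # attacker removed
        return "host_compromised", -5     # damage continues
    # secure state: an attack may arrive; decoys make compromise less likely
    attack_probability = 0.2 if action == "deploy_decoy" else 0.5
    if random.random() < attack_probability:
        return "host_compromised", -1
    return "secure", +1

q_table = {(s, a): 0.0 for s in STATES for a in ACTIONS}
alpha, gamma, epsilon = 0.1, 0.9, 0.1    # learning rate, discount, exploration

state = "secure"
for _ in range(20000):
    # epsilon-greedy: mostly exploit the best known action, sometimes explore
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = max(ACTIONS, key=lambda a: q_table[(state, a)])
    next_state, reward = step(state, action)
    best_next = max(q_table[(next_state, a)] for a in ACTIONS)
    q_table[(state, action)] += alpha * (reward + gamma * best_next - q_table[(state, action)])
    state = next_state

for s in STATES:
    print(s, "->", max(ACTIONS, key=lambda a: q_table[(s, a)]))
```

The CAGE challenge environment is far more complex, with many hosts and persistent, stealthy attackers, but the same trial-and-error principle underlies the approach described below.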
In February 2022, the team won the first Cyber Autonomy Gym for Experimentation (CAGE) challenge, run by The Technical Cooperation Program (TTCP) – an international organisation focused on defence science and technology. The challenge tasked teams with defending a network against two simulated ‘advanced persistent threat’ agents – attackers that typically remain undetected inside networks for weeks or even months. The Turing team programmed three RL agents that work together to defend the network by, for example, removing attackers, laying traps for attackers (by creating decoy network hosts), or terminating malicious processes. The team has made its code publicly available so that others can benefit from and expand upon this work, and it plans to use its success to kick-start a deeper programme of research around RL and network defence. “Dstl is proud to have been part of this challenge with the Turing. Through cutting-edge reinforcement learning techniques, this research is taking us another step towards achieving autonomous cyber defence decision-making.” Josh C UK CAGE representative and Principal Data Scientist, Defence Science and Technology Laboratory (Dstl) The Turing Way: blazing a trail for reproducible research Related programmes and teams Tools, practices and systems Research engineering AI for science and government Since its launch in 2019 as a collaboratively written online guide to reproducible data science, The Turing Way has blossomed into a comprehensive handbook with four extra guides on project design, communication, collaboration and ethical research. This year, the project has continued to expand, fostering a far-reaching community that is banging the drum for reusable research. Managed by a core team of ten at the Turing, The Turing Way now boasts over 320 contributors, who have collectively written over 260 pages across 54 chapters. Contributors come from countries including India, Australia, South Africa, the Netherlands, Argentina and the USA, and have backgrounds across academia, government and industry. The Turing Way hosts a smorgasbord of virtual events for contributors to share ideas: weekly co-working calls, fortnightly collaboration cafés, and twice-yearly, multi-day Book Dash events. New for 2021 were monthly Fireside Chats: informal webinars on topics including research infrastructure, multilingual data science and community collaboration. Members of the core team also delivered 11 training workshops and 39 conference talks during 2021, and the handbook is being translated into Spanish, Arabic, Portuguese and Chinese. All of this is helping to put The Turing Way in front of as many students, researchers, project leaders and educators as possible. The project is increasingly being recognised in the wider science and tech community. This year, The Turing Way was referenced in the Mayor of London’s Emerging Technology Charter, and it is inspiring several similar resources, including the FAIR Cookbook for the life sciences, a guide to running citizen science projects for research libraries, and the Turing’s own The Environmental Data Science Book. “We’re so excited to see organisations using The Turing Way to inform and inspire their own projects. 
We’re spreading the word about the benefits of reusable and transparent research.” Malvika Sharan Senior Researcher in Tools, Practices and Systems, and co-lead of The Turing Way, The Alan Turing Institute Providing COVID-19 expertise to the UK government Health and medical sciences Research engineering The Turing-RSS Health Data Lab, a partnership established in August 2020 between the Turing and the Royal Statistical Society, has continued to provide invaluable insights to the government’s UK Health Security Agency, which is responsible for public health protection in the UK. The Lab aims to develop statistical and machine learning techniques to answer policy-relevant questions about COVID-19. It is made up of over 35 people in research, leadership and operational roles from more than 10 institutes, including Imperial College London, MRC Harwell and the University of Oxford. In December 2021, Lab researchers published a paper in Nature Microbiology describing a statistical framework that combines multiple sources of COVID-19 test data to provide more accurate estimates of local virus prevalence. This was followed in February 2022 by a statistical analysis from the Lab, published in The Lancet Regional Health, which found that deprived areas in England with higher proportions of non-White people were associated with higher COVID-19 infection rates. However, the strength of this association varied over the course of the pandemic and for different ethnic subgroups, highlighting the importance of continual monitoring when developing policies aimed at eliminating health inequalities. Other work at the Lab has included a project looking at the potential of diagnosing COVID-19 and other diseases by acoustically analysing someone’s speech or coughs, and a statistical model that uses incomplete COVID-19 test data to estimate (‘nowcast’) the total number of positive tests. Meanwhile, a recent paper describes the Lab’s overarching approach to statistical modelling, setting out a framework that will allow other research teams to rapidly build effective, data-driven models in response to future health emergencies. “The Alan Turing Institute and the Royal Statistical Society came together at pace in response to the COVID-19 pandemic, providing world-class independent research and modelling expertise to the UK government. The partnership has played a highly valuable role in developing and further enhancing the data science and advanced analytical capabilities within the UK Health Security Agency, both in responding to COVID-19 and in tackling new and existing threats to UK health.” Johanna Hutchinson Director of Analytics and Data Science, UK Health Security Agency Living with Machines: harnessing the hive mind Related programmes and teams Data science for science and humanities Research engineering What did Victorians mean by the word ‘machine’? That is one of the questions that the Living with Machines project has been grappling with over the past year, using the power of crowdsourcing to glean information from 19th century newspapers. Living with Machines is a five-year collaboration between the Turing, the British Library and four UK universities. It aims to shed new light on the human impact of the Industrial Revolution, using the latest computational tools to analyse historical newspapers, books, maps and census records. 
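One simple example of such a tool is keyword-in-context extraction, sketched below for the word ‘machine’. This is a generic illustration with invented sample text, not the project’s actual pipeline.

```python
# Illustrative keyword-in-context (KWIC) extraction from digitised newspaper text.
import re

def keyword_in_context(text, keyword="machine", window=40):
    """Return short snippets of text surrounding each occurrence of the keyword."""
    snippets = []
    for match in re.finditer(rf"\b{keyword}s?\b", text, flags=re.IGNORECASE):
        start = max(match.start() - window, 0)
        end = min(match.end() + window, len(text))
        snippets.append("..." + text[start:end].replace("\n", " ") + "...")
    return snippets

# Hypothetical OCR text from a 19th-century newspaper page
page = ("The new sewing machine was exhibited at the fair, while outside "
        "a weighing machine drew a curious crowd.")
for snippet in keyword_in_context(page):
    print(snippet)
```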
The crowdsourcing initiatives, which officially launched on the Zooniverse platform in December 2020, ask volunteers to pore over digitised newspapers and answer questions posed by the researchers. In two of the tasks, participants were asked to read articles and add information about any mentioned machines. Some of the results were surprising: objects such as prams, weighing scales and even scaffolding were all referred to as machines. More than 1,700 volunteers have taken part in the various crowdsourcing tasks, which have also included finding newspaper stories about industrial accidents, and classifying newspaper snippets as ‘advertisement or not’. The researchers are currently co-curating an exhibition at Leeds City Museum, opening in July 2022, that will spotlight some of the initial results. As well as allowing the public to engage directly with the research, these crowdsourcing tasks are providing the Living with Machines team with precious data with which to train its machine learning algorithms. Over time, the aim is to use these algorithms to automatically extract information from digitised newspapers en masse, providing deeper insights into what it was like to live through the Industrial Revolution. “Historical newspapers are a treasure trove of information. With tens of millions of pages now digitised, we need to develop new methods to analyse them at scale. Our collaboration with the Turing is giving us the tools to do that.” Mia Ridge Living with Machines Co-Investigator and Digital Curator for Western Heritage Collections, British Library Section 1.4 Partnerships and collaborations Roche The Turing this year launched a five-year strategic partnership with the Swiss healthcare company Roche – our first pharmaceutical partner. The collaboration will address critical research questions in the area of personalised healthcare, including why the same treatment can affect different patients in different ways. Teams of data scientists from the Turing and Roche will develop advanced analytics to generate insights from multiple types of healthcare data, with the aim of understanding the impact of patients’ individual characteristics on their disease and treatment outcomes. Together, we hope to accelerate the evolution of personalised healthcare and ultimately enhance clinical care for people around the world. Office for National Statistics A new strategic partnership between the Turing and the Office for National Statistics (ONS) is aiming to produce near-real-time statistics to help track changes in the UK economy. The collaboration, which is initially running for two years, will see ONS analysts and data scientists work closely with Turing researchers on a range of projects, including economic nowcasting (creating models that track changes in retail prices, household spending and income at a local level), and developing tools to allow the sharing of sensitive data while preserving privacy. More detailed and accurate monitoring of the UK economy promises to help inform policy and in turn improve people’s economic wellbeing. DSO National Laboratories The Turing has formed a strategic partnership with DSO National Laboratories, Singapore’s national defence research agency, which is investing £3 million over three years in research aimed at improving security in both the online and physical worlds. 
Projects include creating an AI model to detect online abuse in multiple languages and dialects (Singlish, Malay and English), and using natural language processing to automatically extract location information from social media posts, which could be used to track terrorists and other security risks. NIHR Research Support Facility A new Research Support Facility based at the Turing will help to improve the understanding of multiple long-term health conditions. The facility, a partnership with Swansea University, the University of Edinburgh and MRC Harwell, is supported by a £3 million investment from NIHR, which is funding up to £23 million of research to better identify, prevent and treat clusters of multiple long-term conditions. With two-thirds of adults aged over 65 expected to be living with multiple long-term conditions by 2035, this is a much-needed initiative. Our Research Support Facility will offer advanced AI and data science support to the research projects funded by this call. Prosperity partnerships This year, the Turing was the recipient of two prosperity partnerships – grants from EPSRC that fund research collaborations between UK-based businesses and academic institutes. – Project Bluebird is a partnership between the Turing and NATS – the UK’s leading air traffic control company. The research vision is to deliver the world’s first AI system that collaborates with humans to control a section of airspace in live trials, which will put the UK at the forefront of technical advances in this sector. More broadly, the project will explore the use of technologies such as digital twins and machine learning in air traffic control, helping to modernise UK airspace as the country’s aviation industry aims to achieve net zero carbon emissions by 2050. – Project FAIR is a partnership between the Turing and organisations including HSBC and Accenture, dedicated to developing a framework for responsible adoption of AI in the financial services industry. AI technologies have the potential to unlock significant growth in this sector through, for example, personalised products, improved cost efficiency and more effective management of security risks. This interdisciplinary research programme will seek to address the key challenges, exploring how the financial services industry can build AI systems that are fair, robust and transparent without compromising the privacy and security of service users. Royal United Services Institute The Turing’s defence and security programme this year began a collaboration with the Royal United Services Institute (RUSI) – a prestigious UK defence and security think tank. Working with RUSI’s Technology and National Security Programme, we are co-writing two policy papers, the first of which (published in June 2022) aims to establish an independent evidence base to inform future government policy development around the use of publicly available information for national security purposes. Rosalind Franklin Institute A new operational alliance with the Rosalind Franklin Institute will use AI to tackle a major bottleneck in the realm of molecular biology. Directly observing the molecular structure of proteins within their native cells promises to transform our understanding of human health and biology, and this is now possible through a microscopy technique called cryo-electron tomography (CryoET). However, off-the-shelf analysis tools struggle to identify molecular structures in complex CryoET datasets. 
Our collaboration will develop specialist machine learning tools to detect structures in data from the Franklin’s latest CryoET instrument. The ability to observe proteins in their natural environment could ultimately lead to new ways to treat diseases associated with protein misfolding, such as Alzheimer’s and cystic fibrosis.

International engagement
The Turing continues to build relationships with organisations and institutes around the globe. This year, we have established new partnerships with Switzerland’s Roche (page 24) and Singapore’s DSO National Laboratories (page 24), continued our collaboration on the world’s first 3D printed bridge in Amsterdam (page 40), and led a research project with the Global Partnership on Artificial Intelligence (GPAI), involving partners in countries including Chile, Uganda and Pakistan (page 38). Our ‘Trustworthy digital infrastructure for identity systems’ project, funded by the Bill & Melinda Gates Foundation, has moved into its active research phase, developing technologies aimed at enhancing the privacy and security of national digital ID systems. The team is helping the Indian organisation MOSIP (Modular Open Source Identity Platform) in a number of areas, including identity fraud detection and secure identity authentication. The team has also begun collaborations with the governments of India and the Philippines – two countries that are at different stages of their digital ID journey. In September 2021, the project held its first Turing trustworthy digital identity conference, which saw industry leaders, academics and policy makers from 29 countries discuss ways to minimise the ethical risks of digital IDs. Also this year, the Turing began a collaboration with China’s Beijing Academy of Artificial Intelligence through a two-day workshop on AI for environment, climate and sustainability in March 2022. Supported by the British Embassy in Beijing, the University of Exeter and Peking University, the workshop brought together experts from the UK and China to share knowledge on the use of AI to address environmental challenges. The outcome was a list of recommendations for future joint activities. We established a memorandum of understanding (MoU) with Visual Intelligence – a Norwegian research centre that specialises in AI-powered image analysis. Our partnership will explore the use of deep learning to extract useful information from images to address global issues such as threats to wildlife from climate change. Also in northern Europe, we built on our MoU with the Finnish Center for Artificial Intelligence (FCAI) with a one-day event in February 2022, at which researchers from the Turing and FCAI shared their machine learning work. We also developed new links with Italy through the UK-Italy robotics and AI research collaboration workshop in Rome in March 2022, which convened AI and robotics researchers from the two countries to identify priority topics for future collaborations. Finally this year, the Turing launched its first international seminar series – ‘Do great minds think alike?’ – which looks at how AI can transcend geographical and disciplinary boundaries to address some of today’s most pressing problems. Our three online events so far have explored how tech can help to tackle modern slavery, how AI can address the humanitarian cost of climate change, and the hidden human labour behind AI systems.
Section 1.5 Equality, diversity and inclusion The Turing has continued to work towards its equality, diversity and inclusion (EDI) commitments this year, through our three roles as a national body, a research institute and an employer. Our role as a national body In September 2021, we launched our first EDI strategy and action plan. The strategy lays out our commitment to EDI, our achievements to date and our ambitions for the future. The action plan will guide our work and progress in this area – and also hold us accountable. Taken as a whole, the documents are both a practical tool for change and an important statement of the contribution the Turing will make to the wider STEM (science, technology, engineering and mathematics) landscape in this area. Following the launch of the strategy, the Turing published its first EDI annual report in February 2022. As well as providing an overview of our EDI activity, the report details the current diversity of the Turing community (with respect to age, gender, ethnicity and disability status). Improving our data collection and continuing to publish transparent data of who is represented at the Turing will remain a priority over the next year. In January 2022, the Turing responded to the Diversity in STEM inquiry that is being conducted by the House of Commons Science and Technology Committee to investigate the extent of underrepresentation in the STEM workforce, and how policy makers, funding bodies, industry and academia can address it. The Turing’s response to the inquiry summarised the current state of diversity in data science and AI, drawing on our own experiences and research. Our role as a research institute The Turing has continued to play a leading role in conducting research that contributes to creating a more equitable and inclusive society. Highlights featured in this report include the continued ethical leadership of The Turing Way (page 20), our research to improve lives in Bradford’s most deprived areas (page 16), and our work towards a safer internet (page 18). While far from an exhaustive list, other EDI-centred research projects at the Turing include ‘Women in data science and AI’ led by Judy Wajcman and Erin Young, and ‘Data science for tackling modern slavery’ led by Anjali Mazumder. As demonstrated throughout this report, fairness, bias and transparency remain key areas of work at the Turing, with projects including ‘Building an ethical framework for data science and AI in the criminal justice system’ and PATH-AI – a UK-Japan collaboration that is exploring how different cultural interpretations of values such as privacy, trust and agency can be applied to new, more inclusive governance frameworks around AI and other data-intensive technologies. PATH-AI this year published its first interim report. The Turing has also continued its focus on research supporting health and wellbeing, through projects such as the Turing-RSS Health Data Lab (page 21) and our NIHR-funded Research Support Facility (page 25). Elsewhere, the ‘Ground truth for mental health data science’ project is linking anonymised social media data in long-term birth cohort datasets to help improve the next generation of algorithms for mental health and wellbeing research. We remain committed to ensuring that the Turing supports a diverse group of researchers, and we are currently in the process of appointing our first two Daphne Jackson Fellowships, designed to support people returning to academia after a career break. 
Our role as an employer
The Turing’s Network Groups are at the heart of our EDI programme, bringing together passionate members of the Turing community around four key themes. The Disability and Wellbeing Network Group has focused on producing a ‘work wellbeing plan’ that provides guidance for how members of the Turing community can talk to their colleagues or manager about their wellbeing and make positive changes. The group also marked Mental Health Awareness Week with a range of activities, including a photography competition and a facilitated wellbeing session. The Race Equality Network Group marked Black History Month with community events and personal testimonials on the importance of recognising Black identities, plus the release of two special podcasts with Nicol Turner Lee (director of the Center for Technology Innovation at the Brookings Institution) and Nira Chamberlain (leading British mathematician and former president of the Institute of Mathematics and its Applications). The LGBTQ+ Network Group has been connecting with other networks across the country, including a tea with Oxford Brookes University’s LGBTQ+ Staff Forum and the Bishopsgate Institute, which holds extensive LGBTQ+ archives. The group also marked Pride Month with a range of events, including a session with the LGBTQ+ helpline Switchboard, a Q&A with Stonewall and a session reflecting on the legacy of Alan Turing himself (see page 42). On International Women’s Day (8 March), we launched the Gender Equality Network Group, which was previously combined with the LGBTQ+ group. The group has already made connections with its counterpart network in the British Library and has begun planning future activity. Outside of these groups, a new collaboration between the Turing’s EDI team and the research engineering team is exploring how we can use our in-house data science expertise to securely collate and analyse our employees’ diversity data. We have also implemented a new, year-round EDI learning and development programme that aims to equip the Turing community with awareness, knowledge and understanding of issues around inequality and discrimination, and the skills and tools to challenge these when they arise. Finally, the Turing is continuing to ensure that our community is supported through a new reasonable adjustments policy, as well as our new Report + Support platform, which allows employees to directly report any unacceptable behaviour.

Section 1.6 Growing our network

431 Turing Fellows recruited via a new open call
5 Turing AI World-Leading Researcher Fellows appointed
40 interest groups, including 5 new ones this year
12 Data Study Group challenges
122 Data Study Group participants
3,070 pieces of global media coverage mentioning the Turing

Pallab Ghosh (L) interviewing the Turing’s Zoe Kourtzi (R) on BBC Breakfast about her team’s AI system, which promises to diagnose dementia after a single brain scan

Top media coverage: Artificial intelligence may diagnose dementia in a day (BBC News)

Instagram: followers up 68%; average impressions per month: 4.6k
Facebook: followers up 12%; average reactions per month: 56
Twitter: followers up 19%; average impressions per month: 466k
LinkedIn: followers up 51%; average engagement rate per month: 3.7%
YouTube: subscribers up 32%; average views per month: 21k
Section 1.7 Working with policy makers

The Turing’s public policy programme works alongside governments, regulators and international organisations to explore not only how data science and AI can improve policy-making, but also how these technologies should be governed and regulated. With over 45 Turing-based researchers, 50-plus research projects and collaborators from more than 20 universities, the programme has gained national and international recognition. Over 100 public sector organisations have reached out to the programme since it was launched in May 2018, including government departments, regulators, non-ministerial departments, agencies and public bodies, local authorities, police forces and international organisations.

Camden’s Data Charter
In January 2022, Camden Council made history by publishing the UK’s first resident-informed Data Charter, which sets out how the council will ethically collect, process and share data in the borough. A key part of the development of the Data Charter was a participatory panel with 20 representative Camden residents, which took place across three weekends in autumn 2021. This gave residents the opportunity to find out how the council currently uses their data, and to discuss the ethical, social and legal challenges associated with data sharing. The Turing’s public policy programme worked closely with Camden Council and the public participation charity Involve throughout the year leading up to the panel to plan supportive educational material and activities, and also facilitated talks and sessions during the panel itself. A guidebook was also designed and produced by Christopher Burr (Ethics Fellow at the Turing) to help the participants understand the complex issues surrounding data use by local government. The panel discussions fed directly into the development of the Charter, which sets out a vision for how the borough can use data for the public good while building trust and protecting individuals’ rights and privacy. At the Turing, we hope to use this example to support more councils in using data in an ethical and responsible manner.

The AI Standards Hub
The Turing has been selected by the UK government to lead the pilot of a new AI Standards Hub, supported by the British Standards Institution and the National Physical Laboratory. The Hub, which is a commitment set out in the National AI Strategy, aims to foster engagement with the rapidly evolving field of AI standardisation and increase the UK’s contribution to the development of international standards for AI. These standards will play a key role in ensuring that organisations around the world have a common basis for governing AI technologies, which will in turn help to advance the development and adoption of safe and trustworthy AI systems. The AI Standards Hub will: bring together information about AI standards in an accessible and inclusive way; coordinate engagement among the UK’s AI community through workshops, events and a new online platform; and create tools and guidance to help stakeholders across industry, government, civil society and academia contribute to the shaping and use of AI standards.
The Turing is thrilled to be the home of this important new initiative, which will see us working closely with government. We will be formally launching the Hub later in 2022. UNESCO’s General Conference At the UNESCO General Conference in Paris in November 2021, David Leslie, the Turing’s Director of Ethics and Responsible Innovation Research, gave a presentation to mark the launch of the first-ever global agreement on the ethics of AI. The ‘Recommendation on the ethics of artificial intelligence’, adopted by UNESCO’s 193 member states, was the culmination of two years of consultations. David said that the guidance “puts forward values and principles – such as human dignity, social and economic justice, environmental flourishing, and the interconnectedness of all living creatures with each other and the biosphere – that provide a compass for a global AI innovation ecosystem in dire need of directions for principled navigation”. The guidance sets out a series of policy recommendations that provide a framework for stakeholders across public, private and third sectors to put these values and principles into practice. Advancing the debate on AI in finance A new report from the public policy programme explores the use of AI in the financial services sector. Commissioned by the Financial Conduct Authority, the report maps out the potential benefits and harms associated with the use of AI in financial services, in areas including consumer protection, financial crime, the stability of firms and markets, and cybersecurity. Crucially, the report, which was launched at the CogX Festival in June 2021, also examines the fundamental role of AI transparency – the availability of information about an AI system’s decision-making – in pursuing responsible innovation. The report’s authors, Florian Ostmann and Cosmina Dorobantu, hope that this work will help stakeholders in the sector to navigate the evolving AI landscape in pursuit of socially beneficial technologies. Supporting the Council of Europe In response to the growing ethical risks posed by AI technologies, the Council of Europe (CoE) created the Ad hoc Committee on Artificial Intelligence (CAHAI) in 2019. CAHAI was tasked with examining “the feasibility and potential elements of a legal framework for the development, design and application of AI, based on CoE standards in the field of human rights, democracy and the rule of law”. Following the appointment of the Turing’s David Leslie to CAHAI’s nine-member Bureau in 2020, the Turing’s Ethics Theme in June 2021 published a non-technical primer on ‘AI, human rights, democracy, and the rule of law’. The primer is aimed at non-specialists, and introduces the main concepts and principles in CAHAI’s Feasibility Study (which explores how the fundamental rights and freedoms that are already codified in international human rights law can be the basis for a legal framework for AI). In autumn 2021, the Ethics Theme further published ‘Human rights, democracy, and the rule of law assurance framework for AI systems’, which is now supporting the CoE as it enters the next phase of its work on this topic. Leading the way on data justice The public policy programme has established a close relationship this year with the Global Partnership on Artificial Intelligence (GPAI), leading the GPAI project ‘Advancing research and practice on data justice’. 
Data justice is an emerging field that looks at data issues from a social justice perspective, seeking to understand how, for example, historically rooted discrimination, exclusion and exploitation can lead to inequalities in the way data is collected and used. The project’s main outputs have been an integrated literature review, an annotated bibliography and table of organisations, a repository of data justice case studies, and three preliminary guides that introduce the concepts and practicalities of data justice to policy makers, tech developers and impacted communities. This was a truly international endeavour: the team partnered with 12 organisations from Africa, the Americas, Asia, and Australasia, selected for their advocacy and activist work around technology and data rights. These organisations conducted their own research with local communities, providing essential insights throughout the project, and also worked with the Turing team to produce a four-part documentary for non-specialists about data justice (the first episode is available to watch).

“It was important to draw on the expertise and lived experiences of project partners from around the world, putting underrepresented voices at the heart of discussions about data rights, justice and governance.”
Morgan Briggs
Research Associate for Data Science and Ethics, The Alan Turing Institute

Section 1.8 Public engagement

3D printed bridge opens to the public
The Turing now has a presence in the heart of Amsterdam, thanks to the world’s first 3D printed steel bridge, which opened to the public in July 2021. As well as demonstrating the potential of this innovative construction process, the bridge is also equipped with about 100 sensors that measure its load and how the structure vibrates and subtly bends and tilts as people cross it. The Turing teamed up with the bridge’s builders MX3D and US software company Autodesk to develop a digital twin for the bridge – a computer model that will use the bridge’s sensor data to monitor the structure in real time. The digital twin could be used by engineers to analyse how the steel is behaving, providing early information about any maintenance needs, as well as important insights into how 3D printed steel might be used for more complex building projects. The bridge’s launch was covered extensively in the popular press, including New Scientist, BBC Newsround and MailOnline.

Data science and AI glossary
In September 2021, we launched our jargon-free data science and AI glossary – a resource aimed at non-specialists who want to find out more about these topics without having to navigate the technical language. We hope to: lead the conversation around these topics and counter misinformation; clarify the terms that people hear in everyday life (e.g. ‘algorithm’, ‘deepfake’, ‘robot’); and introduce people to new concepts (e.g. ‘neural network’, ‘synthetic data’, ‘digital twin’). We are also hoping that it will be a useful resource for journalists and policy makers, as well as researchers in areas that intersect with data science and AI. The world of data science and AI is ever-changing, so we will be regularly reviewing the glossary and adding new terms to the list.

In the media
Turing researchers and their projects have continued to appear across TV, radio, podcast and news media this year.
It was an especially busy year for Andrea Baronchelli, the Turing’s Token Economy Theme Lead, whose work on mapping out the NFT marketplace led to numerous media appearances, including BBC News, Wired, BBC Radio 4’s Money Box, Vice and NBC News. This followed a blog he wrote for our website – ‘Non-fungible tokens: can we predict the price they’ll sell for?’ – which was our most viewed of the year. Another highlight was the story of an AI system that promises to diagnose dementia after a single brain scan, developed by a team led by Zoe Kourtzi, Turing University Lead at the University of Cambridge. The research, which could allow patients to receive a diagnosis and preventative treatment years before developing symptoms, received extensive coverage in outlets including BBC News, BBC Breakfast TV, The Times, MSN and US public radio. Finally, our new Chief Scientist, Mark Girolami, kick-started his role with a series of high-profile media appearances, including in Engineering & Technology magazine, Forbes and Research Professional. The Turing at COP26 In autumn 2021, the Turing attended COP26 – the UN climate change conference in Glasgow. Our researchers at the conference included Gavin Shaddick (Turing Fellow and Chair of Data Science & Statistics at the University of Exeter) who presented his work exploring the intersections between AI and climate, on projects including ‘Impacts of climate change and heat on health’, and the ‘Climate impacts, mitigation, adaptation and resilience’ (CLIMAR) framework. To coincide with the opening of the conference, Gavin also wrote a public-facing blog for us on the crucial role for AI in tackling climate change. The Turing was also represented at COP26 by Heather Selley, a Turing Enrichment student at the University of Leeds whose PhD research involves analysing satellite imagery to monitor changes in Antarctic ice. On the eve of the event, she made national news when she named nine fast-flowing glaciers in West Antarctica – including ‘Glasgow Glacier’. Heather presented her research to COP26 delegates in the Cryosphere Pavilion, and also appeared at the Space4Climate stand in the Green Zone to talk with the public about the importance of satellite data in measuring the impacts of climate change. Hosting a Reith Lecture Every year since 1948, BBC radio has broadcast the Reith Lectures, giving a national platform to leading figures of the day. This year’s lecturer was Stuart Russell, an AI expert who is professor of computer science at the University of California, Berkeley. We were delighted at the Turing to host the first of Stuart’s four lectures, which were listened to by over a million people. Broadcast on BBC Radio 4 and the BBC World Service in December 2021, his lecture at the Turing traced the story of AI back to Aristotle and looked towards the future, arguing that the arrival of AI is the biggest event in human history. Listen to the lecture on BBC Sounds. Exploring young people’s attitudes to AI In February 2022, the Turing worked with the British Science Association’s Future Forums initiative to gather the opinions of 14- to 18-year-olds on AI. This involved a survey of 2,000 young people, plus four in-person workshops, each with two Turing researchers and ten participants. The aim of this project was to give young people in the UK an opportunity to air their views on how AI is used in society, and which AI applications they would like to see investment in (mental health and climate change were the most popular options). 
In turn, we were able to directly connect with this underrepresented group, giving us insights into the AI-related issues that matter most to young people. About one-quarter of workshop participants said that they were previously unaware of AI’s presence in their daily lives, which demonstrates the need for further educational outreach in this space. Exploring Alan Turing’s legacy On 23 June 2021, on the 109th anniversary of Alan Turing’s birth, the UK’s new £50 banknote was launched, featuring a portrait of Turing alongside his quote about early computers: “This is only a foretaste of what is to come and only the shadow of what is going to be”. In the run-up to the occasion, we held a free virtual, public event: ‘Breaking the code: Alan Turing’s legacy in 2021’. In this engaging panel discussion, guests including Sir Dermot Turing (author and nephew of Alan Turing), Sue Black (computer scientist and organiser of the Saving Bletchley Park campaign), Clara Barker (material scientist and LGBTQ+ campaigner) and Shakir Mohamed (research scientist at DeepMind) explained why Alan Turing means so much to them, and how his life and work can inspire positive changes in STEM today. Watch the event.   Section 1.9 Convening academia, industry and policy makers AI UK 2022 The Turing’s flagship AI UK event returned in March 2022 for a two-day showcase of the best of UK research in AI and data science. More than 2,100 attendees from 52 countries joined us, from across academia, industry and the public sector. After last year’s virtual event, we moved to a hybrid format for 2022: the majority of our 200+ speakers were filmed in person in London, while attendees logged on via our online platform to watch live. With 48 sessions and 11 workshops across four stages, AI UK 2022 covered a huge array of topics, including biodiversity, smart cities, AI imagery, labour exploitation and data governance, to give just a flavour. Uniting this diversity of topics was a clear overarching theme: how do we maximise the benefits of AI while limiting its potential harms? Stellar speakers at the event included Sir Patrick Vallance, Dame Stephanie Shirley and Jeanette Winterson. These sessions and more can now be watched on our YouTube channel. Planning is well underway for AI UK 2023. COVID-19 report A major report published by the Turing in June 2021 reflected on the response of the UK’s data science and AI community to the pandemic. ‘Data science and AI in the age of COVID-19’ summarises insights from a series of workshops held in late 2020, which involved about 100 experts from backgrounds including ethics, mathematics and medicine. The report found that the data science and AI community stepped up to work alongside clinicians, policy makers and government at the heart of the response. But there were also substantial challenges that prevented the community from realising its full potential. The report highlights a need for: increased data availability, access and standardisation; greater representation of minority groups, both within datasets and within the research community; and clearer and more accurate communication of research findings to policy makers and the public. The report presents suggestions from the workshop participants for how the community might address these challenges and be better placed to respond to the next public health crisis. 
Turing Network Development Awards A new initiative from the Turing this year saw 25 UK universities receive funding to help build new collaborations across the data science and AI sector. These Turing Network Development Awards (TNDAs) gave grants of up to £25,000 to institutions from Scotland, Wales, Northern Ireland and England. The aims of these awards include: growing the data science and AI research and innovation community across UK universities; raising awareness of the Turing’s opportunities and initiatives; identifying and exploring complementary areas that could form the foundation of future collaboration; and supporting knowledge exchange across the Turing network. Our future vision at the Turing includes working with more universities and researchers from across the UK to drive impactful research in data science and AI. The TNDAs are a starting point for this, and we expect collaboration and strategic alliances to develop beyond the lifetime of these awards. Data Study Groups After a reduced schedule in 2020, our Data Study Groups (DSGs) were back in full force in 2021 with events in April, September and November. These collaborative ‘hackathons’ allow organisations from industry, government and the third sector to pose real-world data science challenges to talented teams of carefully selected researchers. Running as virtual events since the pandemic, DSGs take place over three weeks, with a week of part-time, preparatory workshops and presentations, followed by two weeks of full-time group work. In total, 122 participants tackled 12 challenges in 2021. These included a challenge with Cefas to classify plankton species using machine learning (also see page 14), and three challenges with the UK Dementia Research Institute that used AI to tackle research problems in this area. The DSG team was also thrilled this year to win a KE Award for Academic Engagement of the Year. The KE Awards, organised by PraxisAuril – the UK’s professional association for knowledge exchange practitioners – recognise the people and initiatives that have helped to initiate and deliver impact from publicly funded research. Supporting the National AI Strategy A significant development within the UK’s AI ecosystem this year was the launch of the government’s National AI Strategy, which sets out a 10-year plan for enhancing the nation’s reputation in this domain by boosting businesses’ use of AI, attracting international investment, and developing the next generation of tech talent. The Turing supported the development of the Strategy by conducting a survey of the UK’s AI ecosystem in June 2021, working alongside the AI Council. The survey gathered the views of over 400 people who are researching, developing, working with, or using AI technologies. The results, summarised in our report, revealed areas that the UK needs to prioritise to realise the full potential of AI, including recruiting and retaining top AI talent, translating research into commercial products, and growing technical and ethical awareness at the leadership level. We are committed to playing our role in helping to foster a sustainable, inclusive and multidisciplinary UK AI ecosystem. Interest groups Our interest groups continue to bring researchers together around shared passions, across the UK and beyond. We now have 40 interest groups at the Turing, totalling over 3,500 members. Five groups have been founded this year, including ‘Clinical AI’, ‘Synthetic data’ and ‘Bridging machine learning and behaviour models’. 
The groups act as forums for sharing knowledge, with the aim of sparking new ideas for collaborations and projects. Highlights this year include the setting up of the ‘Turing trustworthy AI forum’ interest group, which allows researchers and business leaders to discuss the latest thinking in trustworthy AI, with the goal of helping organisations to deliver safe and ethical AI-driven systems. Another success story is the ‘AI & arts’ interest group, which now connects over 70 members from universities, museums and arts institutes around the UK. One of its members, Turing Fellow Drew Hemment, this year launched the Turing-funded ‘The New Real Observatory’ – an interactive, AI-powered tool that artists can use to create imagery by manipulating environmental data. Our first data-centric engineering summit In September 2021, we held our inaugural DCEng Summit – a two-day virtual event that explored how data science tools and methods can improve the reliability, resilience, safety, efficiency and usability of engineered systems. Over 350 attendees watched 70 international speakers across 25 thought-provoking sessions that enhanced the reputation of this innovative and emerging field.   Section 1.10 Skills and training Training new generations of data science and AI leaders is one of the Turing’s three goals. We do this through the delivery of exciting programmes and courses, often in partnership with other experts in the field. Our evolving Enrichment scheme Our Enrichment scheme placements returned to an in-person format this year, providing opportunities, support and training for 76 PhD students from 22 academic institutions. We expanded the scheme to offer placements at the University of Bristol’s Jean Golding Institute and the Leeds Institute for Data Analytics, as well as the Turing offices in London. Recognising the value of remote training and networking, we have also spent this year developing a new Enrichment Community Award, which will be rolled out for the 2022-23 scheme. This virtual option will provide access to the Turing’s growing online community, as well as small grants to enrich students’ research through activities including collaborative visits and conference presentations. This year, we also piloted a Post-Doctoral Enrichment Award scheme, enabling post-doctoral researchers to maintain ongoing online engagement with the Turing community. In the first round, we offered 73 awards (each of which includes a small grant) to researchers across 29 different institutions. Turing Internship Network and Data Study Groups: a new chapter Our Turing Internship Network (TIN) and Data Study Groups (DSGs) this year came together to launch a pilot programme to support charities and NGOs working in environment and sustainability. We are helping to place PhD data science researchers within partner organisations to work on their data challenges over a six-month period. Successful interns will also help to prepare a DSG, which will be held towards the end of the internship. This will see a team of data scientists further exploring the partner organisation’s challenges and coming up with a range of potential solutions or routes for further investigation. This pilot will allow us to use the TIN to better prepare data challenges coming through the DSG pipeline, helping us to understand how these two skills-sharing programmes can be leveraged together for greater impact. 
Ten environmental organisations have now been selected for the pilot, covering areas including biodiversity, pollution monitoring and wildlife protection. The internships will begin in autumn 2022. Online learning and training At the Turing, we offer a range of online courses to people from around the world. Following a competitive application process, over 300 learners this year took part in six online training opportunities, including our flagship courses in research software engineering and research data science. The pandemic has helped us to open up our training resources to anyone with the right prerequisite knowledge, providing us with a great platform to start building an international and diverse learner community. We have also launched activities to promote better teaching practice in data science and AI, including a ‘data science education’ interest group, which has rapidly grown to over 250 members. And we are currently running the inaugural data science and AI educators’ programme, which will allow us to share our developing expertise across the UK and beyond, and enable experts to better teach these essential topics. Also this year, we created Turing Commons – an online platform with educational courses, tools and other resources designed to help researchers from a wide variety of backgrounds reflect on and discuss the social impacts of data science and AI. The first two courses – ‘responsible research and innovation’ and ‘public engagement of data science and AI’ – are now online. Additional courses and resources are scheduled for 2022, alongside plans to engage new partners and groups. National skills leadership As a national institute, we are increasingly called upon to offer leadership in developing data science and AI skills. We have continued to work with the Alliance for Data Science Professionals to define the first set of professional standards for data science. This is a key step in helping to define and address the data skills gap that exists in the UK. We have also been working with DCMS on the development of a Data Skills Indicator – a methodology for data science and AI labour market analysis that will be used by the National Data Strategy team for longitudinal evaluation of the UK skills landscape. Finally, we hosted a ministerial launch for the Data Skills Taskforce’s Data Skills Portal in November 2021, attended by Julia Lopez, Minister for Media, Data and Digital Infrastructure. The portal was developed with support from the UK government, and it provides a self-assessment tool to help businesses understand their organisational and technical readiness with respect to data. It is now used by more than 700 businesses per month.   Section 2 Trustees’ and strategic report The financial statements comply with the Charities Act 2011, the Companies Act 2006, and the Statement of Recommended Practice (SORP) applicable to charities preparing their accounts in accordance with the Financial Reporting Standard applicable in the UK and Republic of Ireland (FRS 102) effective 1 January 2019 (Charity SORP 2nd Edition). The Charity is a registered charity and a company limited by guarantee, governed by its Articles of Association dated March 2015 and a Joint Venture Agreement with the Founder Members dated March 2015. 
Company Number: 09512457
Charity Number: 1162533

Directors/Trustees
The Directors of the charitable company (the “Charity”), as registered at Companies House, are its Trustees for the purposes of charitable law and throughout this report are collectively referred to as the Trustees. The Trustees serving during the year and since the year end were as follows:
Howard Covington – Chair – resigned 18 March 2022*
Nicola Blackwood-Bate
Frank Kelly
Richard Kenway
Kerry Kirwan
Vanessa Lawrence CB
Thomas Melham
Carina Namih
Hitesh Thakrar
Neil Viner
Patrick Wolfe
*The new Chair of the Board of Trustees, Douglas Gurr, is joining the Institute on 1 July 2022.

Key management as at 31 March 2022
Executive Team
Adrian Smith – Institute Director and Chief Executive
Jonathan Atkins – Chief Operating Officer
Christine Foster – Chief Collaboration Officer
Mark Girolami – Chief Scientist
Senior Management Team
Donna Brown – Director of Academic Engagement
Ian Carter – Director of IT and Information Security
Allaine Cerwonka – Director of Academic Research Programmes
Vanessa Forster – General Counsel and Company Secretary
Nicolas Guernion – Director of Partnerships
Catherine Lawrence – Director of Programme Management
Martin O’Reilly – Director of Research Engineering
Clare Randall – Director of People
Kathryn Stillman – Interim Director of Communications and Engagement

Programme Directors as at 31 March 2022
George Balston – Defence and security
Mark Birkin – Urban analytics
Tim Dodwell – Data-centric engineering
Cosmina Dorobantu – Public policy
Chris Holmes – Health and medical sciences
Ben MacArthur – AI for science and government
Helen Margetts – Public policy
Jonathan Rowe – Data science for science and humanities; AI for science and government (ASG)
Lukasz Szpruch – Finance and economics
Sethu Vijayakumar – Artificial intelligence
Tim Watson – Defence and security
Adrian Weller – Artificial intelligence
Michael Wooldridge – Artificial intelligence
Kirstie Whitaker – Tools, practices and systems

Registered Office
The British Library, 96 Euston Road, London, NW1 2DB
Auditors
Moore Kingston Smith LLP, Chartered Accountants, 6th Floor, 9 Appold Street, London, EC2A 2AP
Bankers
Barclays Bank UK PLC, Leicester, Leicestershire, LE87 2BB
Solicitors
Bates Wells, 10 Queen Street Place, London, EC4R 1BE
Mills & Reeve, 100 Hills Road, Cambridge, CB2 1PH

Structure, governance and management Our legal structure The Alan Turing Institute was founded in March 2015 as a registered Charity (1162533) and a company limited by guarantee (09512457). The Institute is governed by its Articles of Association that were adopted on incorporation on 26 March 2015 and a Joint Venture Agreement with the Founder Members signed on 31 March 2015 (together the “Constitutional Documents”). The Constitutional Documents set out the governance of the Institute as the responsibility of the Board of Trustees with some reserved matters to the Founder Members. Purpose of the Institute and main activities As the national institute for data science and artificial intelligence, the charitable objects of the Institute, as set out in its Articles of Association, are the furtherance of education for the public benefit particularly through research, knowledge exchange, and public engagement, in the fields of data sciences. In 2017, as a result of a government recommendation, the Institute added artificial intelligence to its remit. The Institute has power to do anything which furthers its charitable objects. 
In particular, the Institute’s ambitions are to: Advance world-class research in data science and AI, and apply it to real-world problems. Train the data science and AI leaders of the future. Lead the public conversation on data science and AI. The Trustees confirm that they have paid due regard to the Public Benefit Guidance published by the Charity Commission, including the guidance “Public benefit: running a charity (PB2)”, in undertaking their activities. Related parties The Institute’s Founder Members are EPSRC and the University of Cambridge, University of Edinburgh, University of Oxford, University College London (UCL) and University of Warwick. The Founder Members have entered into a Joint Venture Agreement which establishes, along with the Articles, the basis on which the Institute operates. On 1 April 2018, the Institute entered into five-year partnership arrangements with eight additional universities: Birmingham, Bristol, Exeter, Leeds, Manchester, Newcastle, Queen Mary University of London, and Southampton. The Institute has a wholly owned subsidiary, Turing Innovations Limited (company registration number 10015591). Turing Innovations Limited has a minority shareholding in Quaisr Limited, a private limited company (company registration number 12704209). Board composition and responsibilities The Institute is governed by its Board of Trustees; currently, six Trustees have been appointed by the six Founder Members and four are Independent Trustees. The Chair stood down on 18 March 2022, with the new Chair, Douglas Gurr, joining the Institute on 1 July 2022. The Board of Trustees has been established in accordance with the terms of the Joint Venture Agreement. The Board composition is determined as follows: Each Founder Member may appoint one Trustee. Founder Members may, by a unanimous decision, select and appoint an Independent Trustee who acts as Chair of the Board and may from time to time remove such Independent Trustee by a majority decision. The appointed Trustees may appoint further Independent Trustees such that, so far as possible, the total number of Trustees on the Board will be an odd number. The Trustees appointed by the Founder Members must always form a majority of the Board and may from time to time remove and replace Independent Trustees. Organisational management and responsibilities of the Board The Institute’s Board of Trustees is responsible for setting the aims and strategic direction of the Institute, approving key policies, monitoring risks, approving the annual budget and expenditure targets, and monitoring actual and forecast financial results. Trustees meet formally as a Board up to five times a year. In addition, Trustees normally attend at least two strategy days with the Executive and Senior Management Team and undertake further meetings as and when needed. The Executive Team, Senior Management Team and relevant Programme Directors provide Trustees with regular reports on the Institute’s financial position, current activity, organisational updates, and significant issues affecting the Institute. The Executive Team, led by the Institute Director and Chief Executive, and supported by the Senior Management Team and the Programme Directors, is responsible for the day-to-day management of the Institute’s operations and activities. 
The Executive Team, Senior Management Team and the Programme Directors are also responsible for implementing strategy and corporate policies and reporting on performance to the Board. Committees of the Board The Board is supported by three formal Committees. Each Committee has processes in place for managing any conflicts of interest that arise. Audit and Risk Committee This Committee is responsible for audit, finance and risk management, as well as for reviewing the effectiveness of the Institute’s internal control framework and risk management process and its compliance with reporting requirements, and it reports to the main Board on the same. It monitors the work of the external auditors and receives and reviews audit reports. It also monitors the full external audit process and the resulting financial statements, including overseeing the terms of appointment of the external auditors. Membership: Neil Viner (Chair), Hitesh Thakrar, Patrick Wolfe and Stephane Maikovsky (Independent member/non-Trustee). Nomination and Governance Committee This Committee is responsible for all aspects of the appointment of new non-Founder Member Trustees to the Board of Trustees, succession planning and the reviewing and monitoring of governance processes. It also has responsibility for monitoring boardroom diversity and recommending appointments within the Audit and Risk Committee and the Remuneration, EDI and People Committee in consultation with the Chairs of those Committees. In autumn 2021, the Founder Members delegated the search for the new Chair of the Institute to the Nomination and Governance Committee. Membership: Vanessa Lawrence CB (Chair, appointed as Chair on 22 June 2021), Nicola Blackwood-Bate, Richard Kenway and Thomas Melham. Remuneration, EDI and People Committee This Committee advises the Board of Trustees and oversees the preparation of policies and procedures in respect of salaries, emoluments, and conditions of service. In line with these approved policies and procedures, the Committee approves the total remuneration package for the Chair of the Institute, the Institute Director and those senior staff reporting directly to the Institute Director. The criterion for setting pay is the market rate, taking into account industry standards. This Committee also has oversight of equality, diversity, and inclusion (EDI), which has included review and challenge of the EDI strategy and action plan. Membership: Hitesh Thakrar (Chair, appointed as member and Chair on 22 June 2021), Frank Kelly, Richard Kenway and Carina Namih. Advisory boards Other advisory groups are set out below. The Joint Venture Agreement includes the following two advisory groups: Research and Innovation Advisory Committee This group advises the Institute Director and Chief Executive on strategic aspects of the Institute’s research and innovation activities, including support with research and training programmes. Scientific Advisory Board This is an independent group designed to be made up of international experts in academia, industry, and government which provides strategic advice to the Institute’s Board of Trustees and the Executive Team on the development and implementation of the scientific research strategy. During the year, the Board of Trustees agreed to stand down the membership of this board with a view to refreshing it in 2022-23 with an open call for new membership. 
Other advisory and liaison groups: Strategic Partners Board This group is intended to advise the Institute Director and Chief Executive on the content and translation of research generated at the Institute and is intended to collaborate across the Institute and its partners to identify new opportunities. University Partners Board This group is intended to advise the Institute Director and Chief Executive on the research direction of the Institute, the Institute’s relationship with its university partners, and the higher education landscape as it relates to data science and AI. Recruitment and appointment of Trustees The Nomination and Governance Committee aims to undertake an open recruitment process, recommends new candidates for appointment when necessary, and ensures appropriate recruitment and succession plans are in place for Independent Trustees (i.e. those not appointed by a Founder Member). During the year, an open and comprehensive recruitment process was undertaken to identify and appoint a new Chair of the Board of Trustees. This has now been successfully concluded with the new Chair of the Board of Trustees, Douglas Gurr, joining on 1 July 2022. There were no further Trustee appointments made during the 2021-22 financial year. On appointment, every Trustee completes a declaration of interests form which is held within a register of interests and which is monitored and updated on a regular basis and reviewed annually. Trustee-related party transactions are disclosed in greater detail within the financial statements later in this report. All conflicts are actively managed through early identification of potential areas of conflict and appropriate action taken where necessary. Each new Trustee undergoes a tailored induction programme which includes a programme of meetings with the members of the Executive Team and relevant members of the Senior Management Team and other Trustees. New Trustees are provided with a Trustee Information Pack which includes initial information about the Institute and its work, a copy of the previous year’s Annual Report and Accounts, a copy of the Institute’s Articles of Association, a copy of the Joint Venture Agreement, information about their powers as Trustees of the Institute, key corporate policies, and a copy of the Charity Commission’s guidance, “The essential trustee: what you need to know”. Financial review The Institute is funded through grants from Research Councils, Founder Members, University Partners and from strategic and other partnerships. Income of £51.1m (2020/21: £37.3m) has been received during the year, of which £35m was received from Research Councils (2020/21: £19m), £2.3m from Founding Members and University Partners (2020/21: £6.5m), £10m from strategic and other research partnerships (2020/21: £7.7m), £3.4m from other trading activities and investment income (2020/21: £3.2m) and £0.4m (2020/21: £0.8m) from donations. Of the £35m received from Research Councils, £10m was awarded by EPSRC to enable the Institute to continue its evolution. Expenditure of £39.3m (restated 2020/21: £32.4m) has been incurred in the year. Grants payable to Founding Members and other University Partners represent 40% of total expenditure (restated 2020/21: 41%). Staff costs represent 44% (restated 2020/21: 41%) of total expenditure, increasing from £13.3m in 2020/21 to £17.2m in 2021/22, as the Institute expands its research programme delivery and its operational areas to support delivery. 
The remaining 16% (2020/21: 18%) of expenditure covers support costs and other direct costs. The Group made a surplus of £11.8m (restated 2020/21: £4.9m). This has been transferred to reserves and will be used to fund research and Group costs during 2022/23 and beyond. Group net assets at 31 March 2022 are £41.1m (restated 2021: £29.3m). Fixed asset values increased by £0.8m. During the year, £1.1m was spent on computer equipment and £0.5m was spent on the refurbishment of the Institute’s first floor office space. The additions were offset by £0.7m of depreciation and amortisation charges and net equipment disposals. Current assets: Debtors are £11m (2021: £7.9m) and represent an increase in trade debtors and project accrued income from the previous year, driven by higher levels of project activity. Cash, including current asset investments, has increased by £2m to £50.5m (2021: £48.5m). This is largely due to the upfront nature of cash receipts on many of the Institute’s grant awards. A current asset investment of £5m was made in 2020/21 and represents a cash deposit held in an interest-bearing 95-day notice account with Barclays. Creditors: amounts falling due within one year are £22.1m (restated 2021: £26.7m). Grant creditors were £2m higher than last year, driven by increased levels of project activity and a reclassification of grant expenditure due from ‘Creditors: amounts falling due in more than one year’. Accruals and deferred income were £3.2m lower than last year due to a decrease of £1.8m in accrued project expenditure and a £1.4m decrease in project funding received in the current year but deferred. Other creditors were £4.2m lower than in 2021, this being the amount due to UK Research and Innovation (UKRI) for the repayment of grant funding received in previous years for the AI for science and government programme (ASG). This matter was settled in September 2021. Creditors: amounts falling due in more than one year reduced by £1.1m as the majority of grant-funded programme commitments are now due within one year. Going concern The Trustees have assessed whether the use of the going concern basis is appropriate and have considered possible events or conditions that might cast significant doubt on the ability of the charitable company to continue as a going concern. The Trustees have made this assessment for a period of at least one year from the date of the approval of these financial statements. In particular, the Trustees have considered the charitable company’s forecasts and projections and have taken account of pressures on income. After making enquiries, the Trustees have concluded that there is a reasonable expectation that the charitable company has adequate resources to continue in operational existence as set out below. The charitable company therefore continues to adopt the going concern basis in preparing its financial statements. Fundraising The Institute does not engage in fundraising activities with the general public. Costs of raising funds in the financial statements relate to sourcing of new institutional funders. The Institute does not use third parties to assist with fundraising and received no complaints this year regarding its fundraising practices. Treasury Management Policy Treasury management activity monitors the timing and amounts of cash inflows and outflows, in particular monitoring and tracking those activities that result in significant cash movement. 
The Treasury Management Policy is confined to the management of short- to medium-term liquid funds (maximum investment term is 18 months). Assets are protected by investing with approved counterparties. Investments are risk-averse and non-speculative, and the Institute places no income reliance on interest earned. Grant-making policy The Institute’s grants will be subject to outputs being appropriately recorded and assessed. Data held will be in line with the grant guideline requirements issued by UKRI. Fundamental principles have been established and adopted by the Institute. These are as follows: The Institute will award grants that are in line with the charitable objects of the organisation. The Institute intends to assess grants biannually to ensure compliance with the terms of the grant. The Institute expects to assess the progress of each grant within three months of the end of the grant period. Reserves policy The Charity reviews its unrestricted reserves policy each year, taking account of its planned activities and the financial requirements for the forthcoming period. The Trustees believe that the Institute should have access to reserves appropriate to the scale, complexity and risk profile of the Institute. To cover any shortfall in grants and to maintain the financial viability of the Institute, reserves are currently set at the equivalent of a minimum 6 months of anticipated operating costs which amounts to approximately £10m as at 31 March 2022. In November 2019 an award of £10m was made by EPSRC to cover core operating costs for the years 2020/21 to 2021/22. Additionally, EPSRC meets the annual rental costs for leasehold space occupied by the Institute in The British Library’s St. Pancras building. Following a reconciliation of the initial five-year award made to the Institute by EPSRC in 2016, additional funding was also received. The table below shows the total funding received from EPSRC to support core operating costs. The Charity’s unrestricted Fund as at 31 March 2022 was £30.5m (2021: £21.1m). This includes £0.9m (2021: £0.7m) of funding held to cover future years’ financial commitments and funds designated by the Board for investment in future research. This includes £2m (2021: £nil) to support Turing 2.0 research projects; £1m to support the AI for science and government programme (2021: £nil) and £1.5m (2021: £1.7m) for the Institute’s safe and ethical AI programme. This leaves £25.1m of free reserves (2021: £18.7m). The amount in excess of that called for in our reserves policy (as stated above) will be made available for investment in future research. As at 31 March 2022, the Institute holds £10.2m (restated 2021: £8.2m) of restricted reserves. Remuneration policy The Institute is committed to ensuring a proper balance between paying staff (and others who work for the Institute) fairly to attract and retain the best people for the job with the careful financial management of our charity funds. The Remuneration, EDI and People Committee oversees the overall remuneration of staff and specifically that of the Institute Director and Chief Executive, and those senior managers reporting directly to the Institute Director and Chief Executive. The Remuneration, EDI and People Committee assumes the responsibilities of remuneration within the Institute, and oversees the preparation of policies and procedures in respect of salaries, emoluments, and conditions of service. 
Formal consideration of remuneration matters takes place annually, usually at the Committee’s March meeting. However, remuneration matters may also be considered at other meetings if ad hoc issues arise during the year. Under the policies set by the Board, the Committee does not have full delegated authority to approve all matters relating to remuneration, and any recommendation or decision requiring such approval must be ratified by the Board of Trustees. The Institute discloses all payments to Trustees and the number of staff with a total remuneration of £60,000 and above in accordance with the Charity Commission’s Statement of Recommended Practice 2019 (SORP). Streamlined Energy and Carbon Reporting (SECR) Annual energy usage and associated annual greenhouse gas emissions are reported pursuant to the Companies (Directors’ Report) and Limited Liability Partnerships (Energy and Carbon Report) Regulations 2018 that came into force on 1 April 2019. The energy use and associated greenhouse gas emissions reported in the adjacent table are for The British Library’s St. Pancras building, in which the Institute occupies 2,305 square metres of space out of a building total of 126,515 square metres. It is not possible to disaggregate our energy usage and emissions from that of the whole building. The annual reporting period is 1 April to 31 March each year and the energy and carbon emissions are aligned to this period. The electricity and gas consumption figures were compiled by The British Library from invoice records. Emissions per square metre of floor area are reported to reflect the energy efficiency of the building. Reasonable adjustments policy During the year the Institute maintained its policy of giving full and fair consideration to applications for employment made by disabled people. The Institute is committed to the continuing employment and training of employees who become disabled and to the training, career development and promotion of all employees.   Risk management Significant risks to which the Institute and Turing Innovations Limited are exposed are reported formally to the Audit and Risk Committee and the Board of Trustees via the Institute’s corporate risk register. The Institute has a formal risk management framework, reviewed by the Board during the year, which is embedded within the business and supports the identification and management of risk across the Institute. The Senior Management Team and the Programme Directors are responsible for managing and reporting risks in accordance with the Institute’s Risk Management Policy, while the Trustees retain overall responsibility for risk management. The risk management framework incorporates categories of risk which cover generic areas such as funding and growth, compliance and governance, security and controls, and brand and reputation. The Board of Trustees seeks to ensure that the risks are mitigated, so far as is reasonably possible, by actions taken by the Institute’s Executive Team, Senior Management Team, and the Programme Directors. The main risks faced by the Institute are captured on the corporate risk register, which is regularly reviewed by the Board and the Audit and Risk Committee. A summary of the key risks and their mitigations is set out below.
Risk: Sources of funding for the Institute continued to be under review during the year. Mitigation: Prudent financial management of the Institute such that it can react to changes in external funding in an agile, controlled manner. Continue to review options for additional sources of funding.
Risk: Failure to comply with legal and Charity Commission requirements such as data protection, serious incident reporting, the National Security and Investment Act (NSI Act) and export regulations. Mitigation: Continuing focus on improving the control environment during the year, with policies and procedures updated and introduced.
Risk: The pipeline of commercial opportunities reduces due to the post-pandemic economic environment. Mitigation: Deepening existing relationships and focusing effort on strategically important areas.
Risk: Loss of, or inappropriate handling of, the Institute’s data. Mitigation: Implementing robust security processes, both physical and virtual.
Risk: AI applications developed by or in partnership with the Institute being used for malicious or unintended purposes. Mitigation: Strengthening ethics processes, ensuring they remain fit for purpose and adequately resourced.
Section 172 Statement The Board of Trustees are aware of their duty under s.172 of the Companies Act 2006 to act in the way which they consider, in good faith, would be most likely to promote the success of the company for the benefit of its members as a whole. In this section you will find examples of how the Institute has considered our stakeholders when making decisions during the year. The Board has a duty to promote the success of the Institute for the benefit of the members, whilst also having due regard for the interests of our colleagues, for the success of our relationships with suppliers and customers, and for the impact of our activities on the wider community. The considerations of our stakeholder groups are integral to our decision-making. However, where decisions taken may adversely impact a particular stakeholder group, we will always endeavour to treat them fairly. 1. Members Board considerations All Board decisions are made with the success of the Institute for the long-term benefit of its members and stakeholders at the forefront of the minds of the Trustees. Annual General Meeting The 2020-21 Annual General Meeting was well attended by our members and a series of resolutions were proposed. These included the approval of Founder Member-appointed representatives adopting six-year terms of office and the delegation to the Board of Trustees to undertake the recruitment and agree the terms for the appointment of the new Chair of the Board of Trustees. Annual report and accounts Whilst the Institute has a statutory obligation to provide certain information in the annual report, the information is presented in an engaging and understandable way. The Institute also looks to enhance its sharing of information throughout the year through the content made available on the Institute’s website. Founder Member approvals During the year, the member representatives were asked to approve certain matters which were reserved for them as covered in the constitutional documents. This included the formal delegation for the Board of Trustees to undertake the recruitment process for the new Chair of the Board of Trustees.   2. Colleagues Board considerations The Trustees receive regular qualitative and quantitative updates on employee matters from the People Director, who attends Board and Remuneration, EDI and People Committee meetings, including analysis received through employee engagement surveys, regular EDI updates and an annual update on the Performance Review and Performance Related Pay process. 
This provides the Board with oversight of the effects our people engagement has on our performance, and the continued strength of our culture. In addition, the committees also received various reports during the year relating to the wellbeing of our colleagues, such as the annual report on health and safety reviewed by the Audit and Risk Committee and the annual report on EDI reviewed by the Remuneration, EDI and People Committee. Town Hall Enhancing employee engagement is an integral part of the culture of the Institute. Senior management are actively involved in the engagement of colleagues through weekly electronic communications, monthly staff meetings and quarterly Town Hall meetings that involve employees and full-time members of the wider Turing community, to provide updates on business developments. Equality, diversity and inclusion (EDI) Considerable work has been undertaken during the year on the development and implementation for the first year of the EDI strategy and action plan, which focuses on EDI in the role of the Institute as a national body, a research institute and as an employer. Further information on the activities undertaken can be seen in section 1 (page 27). Hybrid working survey The Institute undertook a survey of the Turing community during the year to get an understanding of views on returning to office-based working post-pandemic. The results of this survey were considered by the Remuneration, EDI and People Committee, resulting in the commencement of a 6-month hybrid working pilot which is due to conclude in 2022, with recommendations on any changes to working practices to follow. Employee remuneration and recruitment The Remuneration, EDI and People Committee agreed to implement a cost-of-living increase for all employees of 5% to be effective from 1 April 2022.   3. Customers and suppliers Board considerations The Trustees recognise the existence of a number of key external stakeholders (general public, Founding Members, university partners, strategic and commercial partners, government agencies, public health bodies, charitable foundations, customers, and suppliers). The Trustees remain committed to effective engagement of all stakeholders and are mindful that the Institute’s success depends on its ability to engage effectively, work together constructively and to take stakeholder views into account when taking decisions. Shaping the way that research is undertaken Through providing visible national leadership on setting sectoral best practices – crowd-sourcing the underpinnings for a more accessible, secure and ethical way to undertake data science and AI. Examples from the year include: • AI ecosystem survey – conducting the online survey of the AI ecosystem in support of the AI Council in informing the Office for AI’s National AI Strategy, launched in September 2021. • Living with Machines – harnessing the hive mind. More than 1,700 volunteers reviewed digitised Victorian newspapers to glean insights into the human impact of the Industrial Revolution. • UKRI digital research infrastructure report. At the request of UKRI, the Turing led a review of the UK’s digital research infrastructure needs for AI. The consultation process was completed during the year, with the outcomes due to be delivered in 2022-23. Training and skills development The Institute has continued to improve its training and skills development opportunities. 
Examples from the year include: • The Turing Enrichment scheme, which looks to strengthen the pipeline of skilled UK data science and AI talent by promoting collaborative working by early-career researchers on mission-led challenges across disciplinary boundaries. • Turing Data Study Groups, which won the ‘Academic Engagement of the Year Award’ at the 2021 PraxisAuril KE Awards. Advising the public sector The public policy programme has overseen many research projects dedicated to using data science and AI to inform policy-making and improve public services, as well as building ethical foundations for the use of these technologies in the public sector. Examples from the year include: • Working towards a safer internet: carrying out pioneering work in online safety, which included expert input into the UK government’s Online Safety Bill. • Submitting responses to a number of consultations, including (but not limited to): o UKRI’s EDI strategy. o House of Commons Science and Technology Committee’s inquiries into: Diversity in STEM; Reproducibility and research integrity; and The right to privacy: digital data. o House of Lords Science and Technology Committee’s inquiry into the UK research and innovation system and the ‘science superpower’ ambition. o Independent review of the UK research, development and innovation organisational landscape, led by Paul Nurse. London Universities Purchasing Consortium (LUPC) In 2020, the Institute became a member of the consortium, whose aim is to achieve value for money for its members in their procurement of goods and services. Stakeholder engagement During the year, the Institute worked closely with its customer stakeholder groups across academia, industry, the third sector, and government, including: • University Partner Board meetings. • Strategic Partner Board meetings. • Regular meetings with UKRI/EPSRC. • Regular engagement with universities through our University Liaison Managers. • Hosting the second annual AI UK showcase. • Holding the Annual General Meeting of member representatives.   4. Community and environment Board considerations The Trustees fully appreciate the impact the Institute has on the community in which it operates and that this is a critical factor in its ongoing success as the national institute for data science and AI. Leading the public conversation The Institute’s Events and Engagement programme is one way the Institute leads the public conversation on data science and AI. Examples include the Turing Lecture series and the second annual AI UK event, hosted by the Turing in March 2022. AI UK is the national showcase of artificial intelligence and data science research and collaboration, with the focus of the 2022 event being on how AI and data science could be used to solve real-world challenges. Community collaboration One of the Institute’s key contributions is bringing together experts with a range of skills and from an extensive range of disciplines – from theoretical mathematics to the social sciences – to tackle problems together. To support this during the year, the Turing invested in redesigning its offices at the British Library to provide a more collaborative and open working environment for the community to use. 
Links to industry The Institute has a wide variety of active collaborations with organisations in industry, the third sector and government, eight of which are strategic partners: Accenture, Bill and Melinda Gates Foundation, UK Defence & Security (including GCHQ, Dstl and the Ministry of Defence), Lloyd’s Register Foundation, Hoffmann-La Roche, Office for National Statistics (ONS), NATS (formerly National Air Traffic Services), and the Singapore Defence and Security Organisation – all of which are aligned to the Institute’s charitable objects and Research Programmes. Ethical and safe use of digital technology The Institute has made a key contribution through leading the national and global conversation on the ethical, fair and safe use of digital technologies. For example, during the year this has included activities such as: • Informing the government roadmap towards trustworthy AI through advising on the legal, ethical, and technological risks of AI and machine learning. This was a key input to the Centre for Data Ethics and Innovation report ‘The roadmap to an effective AI assurance ecosystem’. • The Turing Way – the online handbook for reproducible, ethical and collaborative data science, which is in its third year of use. 5. Principal decisions Principal decisions are those which are material to the Institute and significant to any of our key stakeholders. In making the following principal decisions, the Board considered the outcome from its stakeholder engagement perspective as well as the need to act fairly between the members of the Institute. Principal decision 1: Obtained the approval of the Founder Members to undertake the appointment process for the chairperson of the Board of Trustees. Under the Constitutional Documents, the selection and appointment of the chairperson of the Board of Trustees is a matter reserved for the Founder Members. At the Annual General Meeting in July 2021, the Board of Trustees asked the Founder Members to delegate the authority to the Board to undertake the appointment process. The approval was granted and the Board, through the Nomination and Governance Committee, has successfully concluded the appointment, having undertaken an open and competitive selection process. Principal decision 2: Increased oversight of equality, diversity and inclusion (EDI) at board level and the appointment of a Board Champion for EDI. During the year, the scope of the Remuneration Committee was extended to include oversight of EDI matters. This included monitoring of the EDI strategy and action plan for the Institute. The committee was renamed the Remuneration, EDI and People Committee. Based on the recommendations of this committee during the year, the Board considered and agreed that it was in the best interests of the Institute and its community to maintain its membership of the Stonewall Diversity Champions scheme and approved the appointment of Hitesh Thakrar as the Board Champion for EDI. The Board also took the opportunity during the year to review its committee chairs and decided to appoint new chairs for the Remuneration, EDI and People Committee and the Nomination and Governance Committee, where diversity was a key consideration. Principal decision 3: Revised interim funding model for the University Partners. The Board acknowledged the financial impacts of the current economic environment and the uncertainty around the long-term funding models for our University Partners. 
As the original grant awards ended in 2021, it was agreed by the Board to implement a 12-month interim model (October 2021 – September 2022) and subsequently a six-month extension (October 2022 – March 2023), reducing the grant contributions from all University Partners. Principal decision 4: Rationalised the advisory committee structure in the short term whilst the new strategy and funding model is finalised. During the year, the Board stood down the existing membership of the Scientific Advisory Board on the grounds of good governance as they had completed six years of service. The Board identified that its aim was for this group to be reconstituted with an open call for new membership during 2022-23. The Board also decided to close down the Commercial Development Board with the aim of reviewing, in the future, the benefits to the Institute of having a liaison group of industry leaders. Principal decision 5: Investment to refurbish the office space. The Board agreed to extend the existing lease arrangements for the first and fourth floors of the British Library building in King’s Cross for a further two years, until March 2023. Alongside this decision, the Board agreed to invest in a refurbishment of the office space on the first floor to provide a more collaborative and open environment for the Turing community to use post-COVID-19. Principal decision 6: Improvements to the control environment. During the year, the Board instigated improvements to the control environment through reviewing, updating and approving key corporate policies including safeguarding, research ethics, bullying and harassment, and tax evasion, anti-money laundering and terrorist financing. The Board also reviewed and reapproved the risk management framework.   Charity Governance Code The Charity Governance Code (the “Code”) has been developed as an ‘aspirational’ model to support continuous improvement in governance. The Trustees confirmed in 2019-20 their support for the principles-based approach of the Code, agreeing to undertake an annual internal review of governance practice at the Institute. The internal review of governance for 2021-22, which considered current practice, concluded that there had been improvements made during the year in areas identified in the previous year’s review. These included policy development, equality, diversity and inclusion, board diversity, and board and committee composition, with further improvements identified to move the Institute towards best practice in governance as set out in the Code. Relevant areas of particular focus this year have included: Board leadership The Institute is a charity governed by its Articles of Association, adopted in 2015, and a Joint Venture Agreement (“JVA”) with the six founding members of the Institute. The JVA sets out the requirements for the appointment of the trustees, who are the directors of the charity. This enables each of the Founding Members to appoint one trustee to the Board with the requirement that the Founder Member-appointed Trustees are in the majority at all times on the Board. During the year, an open and transparent process was commenced to recruit and appoint a new Chair of the Board of Trustees. In addition to this, there were various examples of good practice including a review of the committee membership with two new committee chairs appointed, taking account of Board diversity. 
Approval was also granted by the Founder Members for their appointed representatives on the Board to be subject to the best practice governance limit of six-year terms of office.

Equality, diversity, and inclusion (EDI)
The Institute recognises that promoting and embedding EDI in our function as employer, research institute and national body is integral to achieving our mission. As an enabler of this, our first EDI strategy and action plan was launched in 2021. Ownership of the delivery of the strategy and action plan resides with the Senior Management Team, with the Remuneration, EDI and People Committee providing oversight and holding the Institute to account for performance against the plan. Examples of outputs from the Board in the first year of the plan included the appointment of Hitesh Thakrar as the Board Champion for EDI, and the Remuneration, EDI and People Committee receiving the first annual report on EDI. Further information on the wider EDI activity at the Institute is available in Section 1.

Strengthening the Executive Team
The appointment of Mark Girolami as the Institute’s Chief Scientist was a key step in the process of delivering a science and innovation strategy during a period of evolution at the national level, marked by the launch of the National AI Strategy and other related strategies. The Chief Scientist has also taken responsibility for leading the Programme Directors and is the Institute’s main point of contact with the advisory and liaison groups that advise the Institute on scientific strategy.

Internal controls: Policy framework
The Trustees have overseen the implementation and updating of key corporate policies that have strengthened the control environment of the Institute during the year. These have included: safeguarding, bullying and harassment, research ethics, and tax evasion, anti-money laundering and terrorist financing policies, among others.

Areas of focus for 2022-23
Having undertaken the annual review of the Institute’s governance practices against the best practice recommendations of the Code, the Board acknowledges the need to focus on delivering continuous improvements and embedding the good practice put in place during this year. Examples of the activities to be focused on during 2022-23 include:
• Developing an overarching Institute strategy aligned to the national strategy, with a science and innovation strategy as a key strand.
• Reviewing and updating Board member induction, appraisal and review processes.
• Updating key corporate policies that are due for renewal in 2022-23, including Conflicts of Interest and the Code of Conduct.

Trustees’ responsibilities statement
The Trustees are responsible for preparing the Trustees’ annual report and the financial statements in accordance with applicable law and regulations. Company law requires the Trustees to prepare financial statements for each financial year. Under that law, the Trustees have elected to prepare the financial statements in accordance with United Kingdom Accounting Standards (United Kingdom Generally Accepted Accounting Practice, UK GAAP), including FRS 102 – The Financial Reporting Standard applicable in the UK and Republic of Ireland. Under company law, the Trustees must not approve the financial statements unless they are satisfied that they give a true and fair view of the state of affairs of the Institute and of the result for that year.
In preparing these financial statements, the Trustees are required to:
• Select suitable accounting policies and then apply them consistently.
• Comply with applicable accounting standards, including FRS 102, subject to any material departures disclosed and explained in the financial statements.
• State whether a Statement of Recommended Practice (SORP) applies and has been followed, subject to any material departures which are explained in the financial statements.
• Make judgements and estimates that are reasonable and prudent.
• Prepare the financial statements on a going concern basis unless it is inappropriate to presume that the charitable company will continue in business.

The Trustees are responsible for keeping adequate accounting records that are sufficient to show and explain the Institute’s transactions, disclose with reasonable accuracy at any time the financial position of the Institute, and enable them to ensure that the financial statements comply with the Companies Act 2006. They are also responsible for safeguarding the assets of the Institute and hence for taking reasonable steps for the prevention and detection of fraud and other irregularities. The Trustees are responsible for the maintenance and integrity of the corporate and financial information included on the Institute’s website. Legislation in the UK governing the preparation and dissemination of financial statements may differ from legislation in other jurisdictions.

Disclosure of information to the auditor
The Trustees who held office at the date of approval of this Trustees’ annual report confirm that, so far as they are each aware, there is no relevant audit information of which the Institute’s auditor is unaware. Each Trustee has taken all the steps that they ought to have taken as a Trustee to make themselves aware of any relevant audit information and to establish that the Institute’s auditor is aware of that information.

Signatory
The Trustees’ annual report is approved by the Trustees of the Institute. The strategic report, which forms part of the annual report, is approved by the Trustees in their capacity as directors in company law of the Institute.
Vanessa Lawrence CB
Trustee
22 June 2022

Section 3 Financial statements
We have audited the financial statements of The Alan Turing Institute for the year ended 31 March 2022, which comprise the Group Statement of Financial Activities, the Group and Parent Charitable Company Balance Sheets, the Group Cash Flow Statement and notes to the financial statements, including a summary of significant accounting policies. The financial reporting framework that has been applied in their preparation is applicable law and United Kingdom Accounting Standards, including Financial Reporting Standard 102 The Financial Reporting Standard applicable in the UK and Republic of Ireland (United Kingdom Generally Accepted Accounting Practice).

In our opinion the financial statements:
• give a true and fair view of the state of the group’s and the parent charitable company’s affairs as at 31 March 2022 and of the group’s incoming resources and application of resources, including its income and expenditure, for the year then ended;
• have been properly prepared in accordance with United Kingdom Generally Accepted Accounting Practice; and
• have been prepared in accordance with the requirements of the Companies Act 2006.

Basis for opinion
We conducted our audit in accordance with International Standards on Auditing (UK) (ISAs (UK)) and applicable law.
Our responsibilities under those standards are further described in the Auditor’s responsibilities for the audit of the financial statements section of our report. We are independent of the charitable company in accordance with the ethical requirements that are relevant to our audit of the financial statements in the UK, including the FRC’s Ethical Standard, and we have fulfilled our other ethical responsibilities in accordance with these requirements. We believe that the audit evidence we have obtained is sufficient and appropriate to provide a basis for our opinion.

Conclusions relating to going concern
In auditing the financial statements, we have concluded that the Trustees’ use of the going concern basis of accounting in the preparation of the financial statements is appropriate. Based on the work we have performed, we have not identified any material uncertainties relating to events or conditions that, individually or collectively, may cast significant doubt on the charitable company’s ability to continue as a going concern for a period of at least twelve months from when the financial statements are authorised for issue. Our responsibilities and the responsibilities of the Trustees with respect to going concern are described in the relevant sections of this report.

Other information
The other information comprises the information included in the annual report, other than the financial statements and our auditor’s report thereon. The Trustees are responsible for the other information. Our opinion on the financial statements does not cover the other information and, except to the extent otherwise explicitly stated in our report, we do not express any form of assurance conclusion thereon. In connection with our audit of the financial statements, our responsibility is to read the other information and, in doing so, consider whether the other information is materially inconsistent with the financial statements or our knowledge obtained in the audit or otherwise appears to be materially misstated. If we identify such material inconsistencies or apparent material misstatements, we are required to determine whether there is a material misstatement in the financial statements or a material misstatement of the other information. If, based on the work we have performed, we conclude that there is a material misstatement of this other information, we are required to report that fact. We have nothing to report in this regard.

Opinions on other matters prescribed by the Companies Act 2006
In our opinion, based on the work undertaken in the course of the audit:
• the information given in the strategic report and the Trustees’ annual report for the financial year for which the financial statements are prepared is consistent with the financial statements; and
• the strategic report and the Trustees’ annual report have been prepared in accordance with applicable legal requirements.

Matters on which we are required to report by exception
In the light of the knowledge and understanding of the group and parent charitable company and its environment obtained in the course of the audit, we have not identified material misstatements in the strategic report or the Trustees’ annual report.
We have nothing to report in respect of the following matters where the Companies Act 2006 requires us to report to you if, in our opinion:
• the parent charitable company has not kept adequate and sufficient accounting records, or returns adequate for our audit have not been received from branches not visited by us; or
• the parent charitable company’s financial statements are not in agreement with the accounting records and returns; or
• certain disclosures of Trustees’ remuneration specified by law are not made; or
• we have not received all the information and explanations we require for our audit.

Responsibilities of Trustees
As explained more fully in the Trustees’ responsibilities statement set out on page 73, the Trustees (who are also the directors of the charitable company for the purposes of company law) are responsible for the preparation of the financial statements and for being satisfied that they give a true and fair view, and for such internal control as the Trustees determine is necessary to enable the preparation of financial statements that are free from material misstatement, whether due to fraud or error. In preparing the financial statements, the Trustees are responsible for assessing the group and parent charitable company’s ability to continue as a going concern, disclosing, as applicable, matters related to going concern, and using the going concern basis of accounting unless the Trustees either intend to liquidate the group or parent charitable company or to cease operations, or have no realistic alternative but to do so.

Auditor’s responsibilities for the audit of the financial statements
Our objectives are to obtain reasonable assurance about whether the financial statements as a whole are free from material misstatement, whether due to fraud or error, and to issue an auditor’s report that includes our opinion. Reasonable assurance is a high level of assurance, but is not a guarantee that an audit conducted in accordance with ISAs (UK) will always detect a material misstatement when it exists. Misstatements can arise from fraud or error and are considered material if, individually or in aggregate, they could reasonably be expected to influence the economic decisions of users taken on the basis of these financial statements.

As part of an audit in accordance with ISAs (UK), we exercise professional judgement and maintain professional scepticism throughout the audit. We also:
• Identify and assess the risks of material misstatement of the financial statements, whether due to fraud or error, design and perform audit procedures responsive to those risks, and obtain audit evidence that is sufficient and appropriate to provide a basis for our opinion. The risk of not detecting a material misstatement resulting from fraud is higher than for one resulting from error, as fraud may involve collusion, forgery, intentional omissions, misrepresentations, or the override of internal control.
• Obtain an understanding of internal control relevant to the audit in order to design audit procedures that are appropriate in the circumstances, but not for the purposes of expressing an opinion on the effectiveness of the group and parent charitable company’s internal control.
• Evaluate the appropriateness of accounting policies used and the reasonableness of accounting estimates and related disclosures made by the Trustees.
• Conclude on the appropriateness of the Trustees’ use of the going concern basis of accounting and, based on the audit evidence obtained, whether a material uncertainty exists related to events or conditions that may cast significant doubt on the group and parent charitable company’s ability to continue as a going concern. If we conclude that a material uncertainty exists, we are required to draw attention in our auditor’s report to the related disclosures in the financial statements or, if such disclosures are inadequate, to modify our opinion. Our conclusions are based on the audit evidence obtained up to the date of our auditor’s report. However, future events or conditions may cause the group or parent charitable company to cease to continue as a going concern.
• Evaluate the overall presentation, structure and content of the financial statements, including the disclosures, and whether the financial statements represent the underlying transactions and events in a manner that achieves fair presentation.
• Obtain sufficient appropriate audit evidence regarding the financial information of the entities or business activities within the group to express an opinion on the consolidated financial statements. We are responsible for the direction, supervision and performance of the group audit. We remain solely responsible for our audit report.

We communicate with those charged with governance regarding, among other matters, the planned scope and timing of the audit and significant audit findings, including any significant deficiencies in internal control that we identify during our audit.

Explanation as to what extent the audit was considered capable of detecting irregularities, including fraud
Irregularities, including fraud, are instances of non-compliance with laws and regulations. We design procedures in line with our responsibilities, outlined above, to detect material misstatements in respect of irregularities, including fraud. The extent to which our procedures are capable of detecting irregularities, including fraud, is detailed below.

The objectives of our audit in respect of fraud are: to identify and assess the risks of material misstatement of the financial statements due to fraud; to obtain sufficient appropriate audit evidence regarding the assessed risks of material misstatement due to fraud, through designing and implementing appropriate responses to those assessed risks; and to respond appropriately to instances of fraud or suspected fraud identified during the audit. However, the primary responsibility for the prevention and detection of fraud rests with both management and those charged with governance of the charitable company.

• We obtained an understanding of the legal and regulatory requirements applicable to the charitable company and considered that the most significant are the Companies Act 2006, the Charities Act 2011, the Charity SORP, and UK financial reporting standards as issued by the Financial Reporting Council.
• We obtained an understanding of how the charitable company complies with these requirements through discussions with management and those charged with governance.
• We assessed the risk of material misstatement of the financial statements, including the risk of material misstatement due to fraud and how it might occur, by holding discussions with management and those charged with governance.
• We inquired of management and those charged with governance as to any known instances of non-compliance or suspected non-compliance with laws and regulations.
• Based on this understanding, we designed specific, appropriate audit procedures to identify instances of non-compliance with laws and regulations. This included making enquiries of management and those charged with governance and obtaining additional corroborative evidence as required.

There are inherent limitations in the audit procedures described above. We are less likely to become aware of instances of non-compliance with laws and regulations that are not closely related to events and transactions reflected in the financial statements. Also, the risk of not detecting a material misstatement due to fraud is higher than the risk of not detecting one resulting from error, as fraud may involve deliberate concealment by, for example, forgery or intentional misrepresentations, or through collusion.

Use of our report
This report is made solely to the charitable company’s members, as a body, in accordance with Chapter 3 of Part 16 of the Companies Act 2006. Our audit work has been undertaken so that we might state to the charitable company’s members those matters which we are required to state to them in an auditor’s report and for no other purpose. To the fullest extent permitted by law, we do not accept or assume responsibility to any party other than the charitable company and the charitable company’s members as a body, for our audit work, for this report, or for the opinions we have formed.

For and on behalf of:
Moore Kingston Smith LLP
Statutory Auditor
Date: 29 June 2022
6th Floor
9 Appold Street
London, EC2A 2AP

[Consolidated Statement of Financial Activities, Balance Sheet, Consolidated Statement of Cash Flows and Notes to the financial statements not included in this plain text version]