Children and AI team

Can you introduce yourselves and your backgrounds?  

Mhairi Aitken: I’m a Senior Ethics Fellow in the public policy programme. My background is in sociology and for more than 15 years my research has focused on public engagement with science and technology. A lot of my previous work has involved engaging members of the public in discussions around technology policy and innovation, but children are very rarely included in these processes. I am proud that our team is doing something about that!

Morgan Briggs: I am a Research Associate for Data Science and Ethics. I bring formal training in both data science and ethics to our research. I have worked across various sectors, always finding ways to centre responsible AI practices and human rights principles. Mhairi and I launched the children and AI team in 2021.

Sabeehah Mahomed: I’ve been an Ethics Researcher with the team since joining the Turing in 2022. I bring an interdisciplinary perspective, as my background spans the humanities, technology, entrepreneurship and education. I completed my MSc in Digital Humanities at UCL, which is where I became increasingly interested in the intersection of technology and society.

Members of the Children and AI team standing next to banners related to their work

Tell us about the work your team does

Mhairi: Our work seeks to advance child-centred approaches to AI by putting children’s rights and voices at the heart of the design, development, deployment and governance of AI. We engage directly with children to bring them into conversations about this technology.

Since 2022 we have been collaborating with the Children’s Parliament and the Scottish AI Alliance to partner with schools and directly engage children aged 8–12, exploring their relationships with AI and how they want to be involved in shaping its future. We’re now expanding this work to develop, test and demonstrate best practice in children’s engagement with AI. We have also collaborated with UNICEF to pilot their policy guidance on AI for children, and with the Council of Europe’s Steering Committee on the Rights of the Child to map existing legal frameworks addressing children’s rights in the context of AI.

Why is focusing on children and AI important?

Morgan: Children are increasingly interacting with AI on a daily basis. They have a unique set of needs compared to adult users, but they’re also the group least represented in decision-making around the design, development and deployment of AI systems, and in discussions around AI policy and regulation. We believe that needs to change, and that’s a big motivation driving the work of our team.

Mhairi: It’s important to recognise that no adult today has experienced growing up in a world with generative AI, so children really are the only experts in how children interact with these technologies. They need to be part of these conversations to ensure that any policies or regulations developed are based on an understanding of children’s actual needs and experiences.

What are you most excited about in the area? What are the biggest challenges?  

Sabeehah: It’s great to see how quickly children grasp emerging technologies, and I’m intrigued to see what they may grow up to do in this area. One of the main challenges is that the full impact of these technologies on children, their rights, and wellbeing may only be understood retrospectively.  

Mhairi: It’s been exciting to see how engaged and enthusiastic the children we’ve worked with are about being involved in discussions around AI. They are not only capable of understanding complex concepts, like how AI systems are developed, or the importance of addressing bias and fairness, but they have fantastic ideas about how to maximise the value of AI and mitigate its risks. The biggest challenge isn’t engaging children in this space, but ensuring that decision-makers, including developers, policymakers and regulators, listen to the voices of children and meaningfully take their views on board.

Can you share something from your research that might surprise people?  

Sabeehah: Adults are always surprised at how well children grasp concepts like AI and machine learning, which are often assumed to be too difficult for them.  

And children want to be involved! As one learner said, “Just because we’re children, doesn’t mean we don’t know what we’re talking about. Doesn’t mean we don’t understand. We know what we’re talking about because we’ve learnt. Our voices need to be heard as well, because sometimes it’s only us [children] that are trying, and only us [children] that are listening.” 

What role does public engagement play in your research?  

Mhairi: Public engagement is a crucial part of this work. Our research is only possible through directly engaging children, but it’s also really important that we engage more widely with the public to share what the children are telling us and to advocate for their interests while children themselves remain underrepresented in policy, regulation and innovation.  
We are always looking for new ways to bring children into discussions around AI and to ensure that policymakers, researchers, developers and regulators know that engaging children in this area is essential. Ultimately, we want people to know that child-centred AI is better AI!

Find out more about the work of the children and AI team