Pamela Ugwudike

Pamela Ugwudike's research considers the ethics and social implications of AI technologies applied in justice systems to predict crime risks and determine levels of criminal justice intervention.

What are you currently working on?

I am currently researching AI ethics and accountability, with a focus on data bias and the implications for criminal justice. This forms the basis of the Turing project I am leading, which is exploring the predictive mechanisms underpinning a machine learning algorithm used by some police services to forecast future crime locations. The project’s primary objective is to identify possible conduits of bias and corrective measures. To do this, the project uses a computational model that replicates the algorithm to run large-scale tests and observe correlations between data inputs and outputs (predictions).

Similarly, another project I’m leading is investigating the relationship between AI and criminal justice. The study, funded by the Web Science Institute (WSI) at the University of Southampton, is exploring how platform algorithms integrated into Online Social Networking Sites (OSNs) interact with broad social factors to structure discourses about crime and punishment.

What first got you interested in your field of research?

Leading two research projects on the benefits of digitisation for criminal justice services first sparked my interest in AI ethics and accountability. I found that data-related issues can prompt predictive algorithms to entrench the belief that social categories such as race and social status are linked to risk of recidivism. One example is the tendency of some predictive algorithms to obscure problems such as racial bias and to infer from patterns in datasets that high arrest rates are risk predictors. Another is the inference that structural problems, such as poor access to education and employment, are also risk predictors.

What aspect of your work is most exciting you right now?

I find the interdisciplinarity of my current projects particularly exciting, as they bring together researchers from social, computer, and data science disciplines. This makes it possible to bridge gaps between the technical and the social, demonstrating how innovative interdisciplinary methods can be when used to study pressing social problems.

What three words would you use to describe your work?

AI ethics, data bias, digital society.

What book/s does everyone need to be aware of?

  • My forthcoming book with Professor Ros Edwards, Governing Families with Technology, to be published next year by Routledge.
  • Ruha Benjamin (2019), Race After Technology: Abolitionist Tools for the New Jim Code (Cambridge: Polity Press).
  • Virginia Eubanks (2018), Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (New York: St. Martin's Press).
  • Safiya Umoja Noble (2018), Algorithms of Oppression: How Search Engines Reinforce Racism (New York: New York University Press).

When not working, what can you be found doing?

Reading! My preferences are historical fiction and literary/narrative non-fiction. I am also a keen runner.