Tris was a placement student in the 2023/24 Enrichment scheme.
Convincing someone to change their beliefs is notoriously challenging, as anyone who has tried can attest. A substantial body of evidence from cognitive science indicates that humans are remarkably resistant to changing their minds – even when faced with contradictory evidence. As a psychologist working on belief systems, I’m trying to understand why we’re so resistant to changing our minds and how I can apply data science and AI to tackle this fascinating and complex problem.
How are our beliefs built?
When examining human belief systems, it’s crucial to consider that our beliefs do not exist in a vacuum; rather, they are part of larger systems of ideas. These interconnected idea systems, known as ‘intuitive theories’ or ‘cognitive models’, are made up of concepts and the cause-and-effect links between them, which together shape our understanding and expectations of the world around us.
Some beliefs tend to coexist. For example, those who doubt the reality of human-caused climate change are more likely to trust alternative medicine. Similarly, scepticism towards vaccines often aligns with a belief in the superiority of parental intuition over medical expertise. These interconnected beliefs reinforce one another, making them resilient to contradictory evidence.
Exploring how beliefs interact with and support one another provides insights into how to introduce new information or perspectives that can challenge and potentially reshape rigid belief systems. This research can in turn inform approaches to changing people’s minds in ways that could benefit us all. In the case of vaccination, for instance, changing people’s attitudes could help increase vaccination rates and ultimately save lives.
Why do we struggle to change our beliefs?
So far, research on belief change has focused on studying how we update our beliefs in response to new information. But this research has largely overlooked the wider systems of ideas in which our beliefs are situated, and the fact that individual beliefs do not change in isolation.
Existing research has, however, been valuable in shedding light on some key features of this process. For example, we might expect that when we encounter new information, we would use it to update our beliefs accordingly. In reality, though, our brains often don’t follow this ‘rational’ path. As a general rule, we are more willing to embrace new information that we regard as good news, and more likely to ignore information that we view as bad news. We are also particularly resistant to changing beliefs that challenge our self-image and those connected to parts of our identity. This is where the well-known phenomenon of ‘confirmation bias’ comes into play: our tendency to seek out and favour information that confirms our existing beliefs, while discounting information that contradicts them.
My research with the Turing focuses on studying belief change within the context of wider systems of belief. One intriguing finding from our research programme is that people often create ‘auxiliary hypotheses’ to resolve conflicting information – instead of changing their minds, they come up with unverified propositions that prop up their existing beliefs. For example, suppose someone believes both that a particular medical treatment works and in a conspiracy theory that suggests the medical community is hiding the truth about this treatment. When faced with evidence that the treatment is ineffective, instead of rejecting either belief, they might create an auxiliary hypothesis speculating that the treatment works but that the medical community’s supposed secrecy is preventing it from being used correctly. This additional proposition allows them to keep believing in both the treatment’s efficacy and the conspiracy theory. More generally, our research suggests that interventions aimed at changing people’s minds need to be designed to address entire belief systems rather than isolated beliefs.
How can we study belief change in the real world?
We bring our core experimental research to bear in the real world by connecting it with studies of belief change using social media datasets. A notable project involves analysing data from the Reddit forum ‘Change My View’, where users engage in structured debates with the goal of persuading each other to reconsider their beliefs. This resource contributes to our research programme in two significant ways. Firstly, by analysing anonymised interactions and outcomes collected from this public platform, we glean insights into the strategies and mechanisms that facilitate belief change in real-life scenarios. Secondly, it allows us to trace the formation and revision of entire belief networks, while also letting us study how a change in one belief affects related beliefs about the same or seemingly unrelated topics. Using this data, we can reveal how beliefs about sociopolitical issues, religion, science, interpersonal relationships and other topics are connected, and how they influence each other.
To effectively process and analyse vast amounts of social media text data, we leverage advanced natural language processing (NLP) techniques and machine learning algorithms. Through these tools, we construct computational models that capture belief dynamics in the real world, providing valuable insights into human cognition. Previous work in this area has, for example, highlighted the importance of providing evidence and using friendly language when trying to convince someone to change their mind.
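To give a flavour of what modelling belief dynamics computationally can look like, here is a minimal illustrative sketch – not the project’s actual model, and with entirely hypothetical beliefs, numbers and link weights. It represents a belief system as a small weighted graph, where revising one belief nudges the beliefs connected to it, capturing the idea that beliefs do not change in isolation:

```python
# Toy belief network (illustrative only): each belief has a strength
# in [0, 1], and weighted links capture how strongly one belief
# supports (positive weight) or opposes (negative weight) another.
beliefs = {
    "treatment_works": 0.9,
    "medical_conspiracy": 0.8,
    "trusts_doctors": 0.2,
}

# Hypothetical support links: (source, target, weight).
links = [
    ("treatment_works", "medical_conspiracy", 0.5),
    ("medical_conspiracy", "trusts_doctors", -0.5),
]

def revise(beliefs, links, target, evidence, rate=0.5):
    """Nudge `target` towards `evidence`, then propagate a fraction of
    that change along each outgoing link - a crude stand-in for the
    idea that revising one belief shifts the beliefs connected to it."""
    updated = dict(beliefs)
    delta = rate * (evidence - updated[target])
    updated[target] += delta
    for src, dst, weight in links:
        if src == target:
            # Connected beliefs shift in proportion to the link weight,
            # clamped so strengths stay within [0, 1].
            updated[dst] = min(1.0, max(0.0, updated[dst] + weight * delta))
    return updated

# Strong evidence against the treatment lowers belief in it, and the
# linked conspiracy belief weakens along with it.
new_beliefs = revise(beliefs, links, "treatment_works", evidence=0.0)
```

In a real model, of course, belief strengths and link weights would be estimated from data rather than stipulated, and the update rule would be far richer – but even this crude propagation illustrates why targeting a single belief in isolation can be ineffective.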
Integrating findings from experimental studies with observations from online forums allows for a more comprehensive understanding of how beliefs are formed, challenged, and potentially transformed, in diverse social contexts. By getting a handle on how belief change works at a fundamental level, we can gain a better understanding of why it’s so hard for us to change our minds about everyday issues such as climate change, alternative medicine, vaccines, and more. This deeper insight can then inform the development of more effective strategies for addressing misinformation and promoting behaviour change that benefit society as a whole.
At AI UK 2024, Tris’s talk ‘Facts Don't Change Minds’ won PitchFest - a competition where researchers tell 90-second stories about their work. Watch her winning talk below:
Top image: lunamarina