False information about COVID-19 has been described as an “infodemic” by the WHO Director-General, and there are concerns that it can lead people to ignore official guidance, avoid getting vaccinated, or even use harmful ‘miracle’ cures. Numerous projects have been established over the past year to find, flag and monitor misleading online health-related content, but far less research has investigated who believes the content in the first place, and why.
In our project at The Alan Turing Institute, funded by The Health Foundation, we used a mix of surveys and assessments to address this gap. Identifying who is most vulnerable to misinformation is crucial for building a deeper understanding of the problem, and for developing more targeted, effective interventions to tackle its root causes. Otherwise, we risk deploying overly draconian, broad and restrictive policies to address misinformation, such as banning content from certain websites. Through our research, we hope to contribute to ongoing debates amongst policymakers and regulators about how best to tackle the problem.
In the study, we asked a panel of 1,700 people, representative of the UK in terms of age and gender, to complete a detailed survey about their personal background, outlook and experiences, and tested their personality, cognitive skills and different literacies. We then presented them with various headline-style health-related claims about COVID-19, measuring their vulnerability to misinformation based on how accurately they assessed the claims. Some of these claims were true (e.g. “COVID-19 can spread through the air”) and some false (e.g. “The COVID-19 virus can be treated by drinking lemonade”).
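To make this concrete, here is a minimal sketch of how such a vulnerability measure could be computed, assuming each claim carries a ground-truth label and each participant gives a true/false verdict. The claim texts are the examples quoted above; the data structure, example responses and scoring rule are our own illustration, not the study’s actual pipeline.

```python
# Minimal illustrative sketch (not the study's actual scoring code): one way to
# quantify a participant's vulnerability to misinformation as the share of
# headline-style claims they judge correctly. The claim texts come from the post;
# the data structure, responses and scoring rule are assumptions for illustration.

claims = [
    {"text": "COVID-19 can spread through the air", "is_true": True},
    {"text": "The COVID-19 virus can be treated by drinking lemonade", "is_true": False},
]

def assessment_accuracy(responses, claims):
    """Fraction of claims judged correctly.

    `responses` is a list of booleans giving the participant's true/false
    verdict for each claim, in the same order as `claims`.
    """
    correct = sum(verdict == claim["is_true"] for verdict, claim in zip(responses, claims))
    return correct / len(claims)

# A participant who accepts both claims gets the true one right and the false one
# wrong, so their accuracy is 0.5 (and their vulnerability correspondingly higher).
print(assessment_accuracy([True, True], claims))  # 0.5
```

In practice, a score like this would be computed over the full set of claims each participant assessed, and then related to the background measures described below.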
Our results show that individuals with lower digital literacy, numerical literacy, health literacy and cognitive skills fare worse at assessing the veracity of health-related statements. Unexpectedly, most sociodemographic, socioeconomic and political factors made little or no difference. These are important results, as they mean that developing people’s cognitive skills and literacies could make a big difference to their ability to identify misinformation. Building these skills would also have benefits beyond tackling misinformation; digital literacy in particular (the ability to use digital technologies to find, evaluate and communicate information) is, research shows, crucial to navigating modern life. This leads us to our first recommendation:
Digital literacy should be explored as a powerful tool for combatting misinformation.
In our study, we demonstrated the power of misinformation by asking participants to assess the same claims after first showing them different types of related content. As expected, those who had been shown true content before assessing a claim fared better than those who had been shown false content. However, surprisingly, giving participants warnings about misinformation before they made their assessments had only a very small impact on their performance. This leads us to our second recommendation:
New strategies for communicating the severity of misinformation to the public are urgently needed.
Finally, we found that individuals differ greatly in their knowledge of health, but that almost everyone has ‘room to improve’. Government policies should aim to enable people to better recognise health-related misinformation, and encourage them to scrutinise content that they are unsure about. Whilst public discourse often focuses on reducing the supply of misinformation (an admittedly important way to tackle the problem), our research draws attention to reducing people’s vulnerability to it. Our third recommendation is:
Address the factors that make people susceptible to misinformation.
We have started to identify these factors in our project, but there needs to be far more research if we are to stem the flow of misinformation online. The health, social and economic consequences of COVID-19 are already devastating – we need to minimise the potential of misinformation to make them even worse, and plan for how to limit the effects of misinformation in future public health crises. Read the full paper here, and if you have any questions, please contact me.
Watch: Instagram Live replay
Watch me and report co-author Becky Inkster in conversation on Instagram, discussing how we can better understand vulnerability to online misinformation, who is more likely to believe it – and what can be done to tackle the problem.
Lead image: zimmytws / Shutterstock