People overestimate the change technology brings in the short term and underestimate its long-term effects, said Gordon McKee MP, while opening a parliamentary event held this month on behalf of the Alan Turing Institute’s Centre for Emerging Technology and Security (CETaS).
The event featured an expert panel discussing the impact of AI-enabled mis- and disinformation on recent elections throughout 2024.
The panel, chaired by the Director of CETaS, Alexander Babuta, included Vijay Rangarajan, CEO of the Electoral Commission; Ciaran Martin, former CEO of the National Cyber Security Centre (NCSC); Marianna Spring, the BBC’s disinformation and social media correspondent; and Sam Stockwell, Research Associate at the Alan Turing Institute.
Sam Stockwell, who analysed AI-enabled mis- and disinformation across major global elections last year, said that “the acute risk that many feared at the start of last year” of election deepfakes realistic enough to sway undecided voters and undermine the democratic process never came to pass. His research found no evidence that such content meaningfully affected election results.
And where examples were found, those sharing the content were usually already ideologically aligned with it – for example, Donald Trump supporters sharing deepfakes of Kamala Harris with other like-minded users. But while this content may not change the minds of undecided voters, it can still do damage by deepening existing partisan divides and echo chambers.
Marianna Spring linked this issue to the way social media algorithms promote content, rather than the existence of the harmful content itself. She said: “It all comes back to the fundamental design of the social media sites. The way that algorithms work and the kind of content they push – they favour reactions, emotions, engagement over safety... lots of the insiders I speak to at these companies confirm that.”
She added: “The companies argue they have the necessary checks and balances to protect users and to protect freedom of expression,” but ultimately, when guardrails are removed, more users are exposed to this kind of content.
Noting that it is the level of exposure, rather than the existence of the content, which leads to impact, she said: “If it weren’t for the algorithms, there probably wouldn't be nearly as many people seeing this kind of stuff, and it wouldn’t be affecting political conversation and debate in the way it is.”
She believes “there's a really interesting and important conversation to have right now about whether we're entering a new social media era under Donald Trump because of the relationship he appears to have with several of the social media CEOs,” many of whom were in attendance at the inauguration. “These social media companies are huge; they’re often very hard to hold to account.”
Ciaran Martin said he has heard concerns from people who feel very strongly that real and systemic harm has already been done to the UK and other democracies, but he does not believe there is yet enough evidence to confirm this.
“Confidence in democracy is really precious. In the UK in the last 12 months we’ve just had a peaceful and orderly transfer of power between two different parties. Speculating without hard evidence that there's been a serious undermining of that is not cost free. So, that's one of the reasons why I approach this with a little bit of trepidation.”
But he does have some concerns. These include the sharing of localised mis- or disinformation through Facebook groups, for example, in order to destabilise communities – which can sometimes be a goal in itself. He said: “Details matter and confidence in democracy matters.”
Vijay Rangarajan highlighted two key concerns: abuse and intimidation of candidates, and trust in elections. He said: “Roughly 80% of the public trust the electoral process in the UK – that is precious but potentially a shallow and fragile thing, and that is our real aim – to try to maintain that.” The focus, he said, should be wider than just the elections themselves because “most politics happens between elections.”
Expressing concern about the use of deepfakes to target individual candidates, primarily women and ethnic minority candidates, he said, “it's really nasty and it spreads through local Facebook groups, WhatsApp groups, and so on. It's very hard to find. And we haven't yet succeeded in finding some of the perpetrators, so there is a big concern about how we track this illegal content down and find out who is making it. Abuse and intimidation is fundamentally a threat to our democracy. Both online and offline attacks cause people a lot of stress and harm, which can put them off wanting to stand as a candidate. This impacts the diversity of candidates, and means voters have less choice.”
For more information about CETaS’ work on election security, read the Centre’s recent research.