The UK’s controversial Online Safety Bill (OSB) is set to bring in a raft of new digital regulations and offences. Despite its imminent passage into law, however, uncertainties over the future of online safety remain: the ongoing debate between government and the tech industry about how to create ‘privacy-by-design’ (PBD) technologies that preserve internet users’ privacy without compromising public safety or national security is at an impasse. At the Centre for Emerging Technology and Security (CETaS), based at The Alan Turing Institute, we have been carrying out research that offers insights into how different stakeholders in this debate can work together to achieve meaningful change.
In a new report from CETaS, our team provides recommendations for fostering a more inclusive and constructive approach to future PBD technologies – technologies that embed data protection considerations at the heart of their design. In doing so, we hope to reduce the potential for these tools to be exploited in online threats such as cyber crime, disinformation and grooming – thereby securing internet users’ safety and human rights.
The forthcoming OSB would empower the UK’s existing communications regulator, Ofcom, to ensure that social media platforms and encrypted communication services (e.g. WhatsApp) regulate illegal and harmful content. But specific provisions of the OSB that legally mandate these companies to identify and remove such content have faced a major backlash. Of particular concern are clauses requiring service providers to scan and assess private messages before they are allowed to be sent. Whilst the UK government recently announced that it is postponing these requirements – until scanning technology exists that is capable of meeting them without infringing on wider user privacy – tech companies still argue that the provisions’ very existence may erode digital rights in the future. Indeed, service providers including WhatsApp and Signal have announced that they may even withdraw their platforms from the UK if the OSB passes into law.
For its part, the UK government has openly criticised the tech industry both for failing to design encryption features that enable law enforcement to access content when it has legal authorisation to do so (termed ‘lawful access’) and for providing malicious actors with the anonymity required to operate with significant impunity. This latter aspect has been the subject of much debate, particularly with regard to the proliferation of child sexual abuse material linked to encrypted platforms.
Yet amid this complex range of concerns, CETaS research reveals a striking lack of constructive dialogue or inclusive engagement aimed at finding an acceptable compromise. The existing literature repeatedly claims that disagreements over encryption designs are simply impossible to resolve. However, accepting such an outcome is dangerous: if we do, threats to users and organisations may only worsen, while the potential benefits of multi-stakeholder collaboration in improving online safety will go unrealised.
It is within this context that the Turing sought to overcome the current impasse, drawing on the Institute’s unique convening power among academics, policy makers, law enforcement and industry. By engaging with people from different sides of the debate, it became clear that – contrary to what the literature suggests – there is widespread agreement that all stakeholders can do more to help find a compromise.
The government officials we engaged with argued that industry stakeholders, as well as internet standards bodies, need to make their organisational processes more inclusive. They highlighted that existing procedures for designing new features or standards often involve minimal input from affected parts of government, such as law enforcement, or from civil society. To address these shortfalls, we recommend introducing a voluntary certification scheme for PBD technology that encourages developers to work more closely with other stakeholder groups before releasing new features. Encryption designs incorporating a diversity of inputs would receive financial rewards from the government and give companies a beneficial marketing angle for their products. Additionally, we suggest facilitating more secondments and training opportunities between law enforcement and industry, to increase trust and foster greater understanding.
The industry representatives we engaged with also emphasised that their research processes would benefit from increased government transparency on safety issues surrounding proposed PBD technologies. Our report therefore recommends declassifying impact assessments, which specify the perceived risks to national security from individual designs, so that they can be publicly released and shared with other stakeholders. We also recommend reducing the technical complexity of ‘lawful access’ proposals – which seek to preserve the benefits of encryption while reducing the challenges it poses to criminal investigations – alongside the complexity of infrastructure and policies needed to support their implementation.
Much of the concern with the incoming OSB centres on fierce debates over trade-offs between fundamental human rights, most notably privacy and security. Yet the key finding from our research is that the very framing of this debate is flawed. Casting privacy and security as competing rights is an oversimplification that draws attention away from other aspects of the problem which offer more realistic opportunities for collaboration. The focus needs to shift: rather than trying to arrive at a perfect outcome on PBD technologies, we should allow all stakeholders to articulate their concerns during the early development stage, creating a far better chance that risks to different human rights can be reduced to an acceptable level before a new design is introduced. It is an openness to compromise among all sides, not an attempt to force consensus, that is desperately needed to safeguard online users for years to come.
Read our report:
The Future of Privacy by Design Technology: Policy Implications for UK Security
Top image: cherdchai