Differences in attitudes towards AI-related privacy, agency and trust between Japan and the UK

First interim report from the PATH-AI collaboration between the Turing, the University of Edinburgh and RIKEN

Thursday 2 December 2021

A new report published by the PATH-AI project today, Thursday 2 December 2021, details initial findings from interviews and surveys in the UK and Japan about attitudes to privacy, agency and trust in relation to AI and digital technologies.

The researchers asked 95 participants, including experts and members of the public, about three different examples of data-intensive technologies in the areas of health and social care: digital contact tracing apps, medical symptom checking tools, and care robots.  

The decision to focus on these case studies was driven in part by the COVID-19 pandemic, during which such tools have gained prominence as ways to reduce the spread of infection and cope with the shifting demands placed on healthcare systems. These are technologies that potentially offer huge benefits, but also raise many questions about ethics and governance. 

There were significant areas of agreement among research participants in both countries. For example, both British and Japanese respondents raised privacy concerns about their governments' early decisions to develop digital contact tracing apps with centralised databases, plans that were later abandoned in favour of a more privacy-friendly decentralised approach. Similarly, both groups expressed worries about how data gathered by symptom checking and other digital health tracking tools might be used or monetised.

There was also a widespread sense of growing asymmetries – of data, informed choice, resources, and ultimately power – between users, governments, and companies. Many respondents felt confused about what data was being collected about them and how it was being used, fuelling feelings of disempowerment, distrust and a lack of agency. The recent focus on data privacy can be seen partly as a response to a growing wariness towards both government policies and technology companies, but several experts noted that legislation has so far tended only to confuse citizens while failing to prevent companies from collecting ever-greater amounts of personal data.

A few key differences in perspective did surface, however. Japanese respondents were more optimistic overall and seemed more comfortable with the idea of interacting with robots, with 90% answering that care robots were a good idea. In contrast, UK respondents were less sure about what the role of robots should be, and agreed unanimously that care robots should only ever supplement, rather than substitute for, human care.

Additionally, some UK participants worried that emerging technologies might cause harm through inadequate design and implementation, whilst many Japanese participants seemed more worried that the technologies would work too well or become too powerful, risking a loss of control or a future society ruled by automated decisions that would be difficult for individuals to contest.

Across both Japan and the UK, experts and members of the public called for greater public education and far clearer communication about these increasingly complex technologies. They also called for more public consultations and meaningful participation in governance and regulation. 

The report was created in collaboration with the University of Edinburgh and RIKEN research institute in Japan, and a more comprehensive version will be published in early 2022. By drawing out differences and similarities between the UK and Japan, the PATH-AI project more broadly aims to explore how different intercultural interpretations of values, such as privacy, trust and agency, can be applied to new governance frameworks around AI and other data-intensive technologies. This is essential to shaping the international landscape for AI ethics, governance, and regulation in a more inclusive and globally representative way. In the next stage of the project, the aim is to develop and pilot a methodology for an intercultural, co-designed framework for more ethical and equitable human-AI ecosystems.  

This work was supported by the Economic and Social Research Council [grant number ES/T007354/1, Principal Investigator: Dr David Leslie] and the Japan Science and Technology Agency. 

PATH-AI residency programme 

Today also marks the launch of the PATH-AI residency programme, which is commissioning artists to create new works that critically engage with intercultural ideas of privacy, agency and trust in relation to AI and other data-driven technologies. The residency will be run in partnership with Somerset House Studios and the UAL Creative Computing Institute.

The programme is aimed at artists interested in exploring the international landscape for AI ethics, governance and regulation. Three artists will be supported to develop new works within a six-month remote residency programme, with final works presented virtually by Somerset House in 2022.  

Find out more about the programme – the deadline for applications is Monday 31 January 2022, 17:00 (GMT).  

To find out more about PATH-AI, please visit the project website and subscribe to updates. You can also follow the project on Twitter @PATHAIResearch.
