Join us for our second webinar – Beyond the exit strategy: ethical uses of data-driven technology in the fight against COVID-19 – jointly organised with the Ada Lovelace Institute.
This webinar has already taken place. A recording of this webinar and summary of presentations are available below.
Friday 17 April, 12.00-12.40 (BST)
Chairs:
- Hugh Whittall, Director, Nuffield Council on Bioethics
- Carly Kind, Director, Ada Lovelace Institute
Speakers:
- Linnet Taylor, Associate Professor, Tilburg Law School, Netherlands
- Ruipeng Lei, Professor of Bioethics, School of the Humanities and Centre for Bioethics, Huazhong University of Science and Technology, Wuhan, China
Data-driven technologies are likely to play a vital role in
enabling societies to transition out of the COVID-19 crisis, restart
their economies and return to ‘normal’ life. Join us for a discussion on
how technologies can be utilised ethically in tackling crises, in ways
that command public trust.
Digital contact tracing is likely to be a core component of
government exit strategies out of COVID-19 lockdowns and has already
proved successful in countries such as Singapore and South Korea. As
policymakers begin to consider strategies for exiting the crisis, the
scope of contact tracing is likely to expand – from tracing the spread
of the disease for containment and treatment, to informing, restricting
or permitting individuals’ movement.
In Europe, discussion is beginning on whether governments should
expand antibody testing and develop ‘immunity certificates’ as a means
for controlling or easing social distancing measures and work
restrictions. Digital identity providers are beginning to propose
digital credentialling systems to enable key workers to establish and
verify their immunity to their employers.
These kinds of interventions present particular ethical, legal and technical challenges, including:
- How can we minimise the potential harms caused by unprecedented intrusions into our personal lives?
- What might be the unintended consequences of using people’s health and immunity status to relax restrictions for some?
- How should we demonstrate solidarity with those who remain vulnerable to infection?
- What should be the terms of any partnerships between governments and tech companies?
- Will this lead us to new norms of privacy and surveillance in the longer term?
Attendees will not be audible or visible during the webinar, but
will be able to put questions to the panel through a Q&A tool.
A recording and summary of the webinar will be available on the Nuffield
Council on Bioethics and Ada Lovelace Institute websites shortly
afterwards.
Find out more about our COVID-19 work and watch a recording of our previous webinar.
Presentations
Linnet Taylor is an Associate Professor at the Tilburg Institute for Law, Technology and Society, Tilburg University, Netherlands, working on global data justice.
We are in a situation today where there is a lot of policy-based
evidence emerging, rather than evidence-based policy. This is because we
are working in a state of profound uncertainty, with very thin
knowledge of the accuracy of the data we have to work with. Scientists
are broadly being balanced and clear in communicating these limitations
and uncertainties, but when this is translated into policy it appears to
be resulting in radical differences between different countries and
regions.
Cultural differences around the world are going to create different
types of evidence that governments are going to have to weigh. For
instance, we are already seeing radical international differences in how effective face masks are considered to be as an intervention. There is some underlying truth about whether we should be using face masks, and about their efficacy, but we don’t know what it is right now. Nonetheless, it is affecting policy. The same is true of technical tooling and surveillance practices, and the varying pace of their adoption. Therefore when we consider data analytics and data-driven
technologies – whether immunity certificates or contact tracing – this
kind of discourse and analysis around ethics, civil and political rights
is hugely important.
The global pandemic is not a singular problem that can be solved, but
a state of the world which needs to be addressed in a continuing way.
This means we are talking about governance – governance
of an ongoing pandemic and our response to it. If we want this
governance to operate through technology, we have to be aware of the
existing limitations of democratic accountability in technology. We want
to avoid a hackathon approach to technologies that deal with our health
and our civil and political liberties, particularly in the context of
information uncertainty.
If we are to have government through technology, it must be accountable. To do that we need:
- Honesty about the gaps in the data, in an ongoing way: being clear on what we don’t know, and having transparency about the mechanisms for ensuring systems aren’t being built on inaccurate data and can adapt as our understanding of the data and its accuracy changes.
- Guarding against function creep: we need to ensure our measures remain relevant to the current state of public health and safety during COVID-19, and are updated as this changes over time, to prevent unnecessary ongoing measures or security theatre.
- Structured protections against irresponsible technology that go beyond GDPR and personal data, with checks and balances provided through a meaningful role for civil society organisations in protecting privacy and critical civil rights, as well as ensuring the global representation and diversity of those involved in the design and development of the technology.
- Balance between centralisation and decentralisation: a key tension is that good privacy technology is decentralised, whilst good government relies on centralised information that enables judicious action; we will need skills and organisations to broker these perspectives.
- No tech solution should be seen as a political solution: we will remain in a state of vulnerability, and the idea that tech can truly and wholly protect us is more dangerous than bad politics or bad government.
Ruipeng Lei is
Professor of Bioethics at the School of the Humanities and Centre for
Bioethics, Huazhong University of Science and Technology, Wuhan, China.
The health code system in China
On 6 February 2020 the leader of Hangzhou city, in Zhejiang province,
proposed that, in order to restart work and enterprises, they should
play to the advantages of the digital economy and establish a unified
digital declaration platform, for data sharing and the creation of
personal health codes. Soon after this intervention, CCTV news reported
on a collaboration between Alibaba and local government for the Zhejiang
health code product – a new type of public-private cooperation. Other provinces quickly sought to replicate it.
The health code system in China requires users to declare information
including their name, fax, mobile phone number, ID number, email
address, activity tracking, health information and contact history. In
addition to this information recorded by users, the health code system
is also connected with national civil aviation, railway and highway
data, as well as local bus and subway data. Telecoms, payment and
banking data is also available to the system. This combined data is then used in real-time comparison and analysis to cross-verify user-declared information, accurately map the movement trajectories of citizens and identify high-risk groups.
From this, health codes of different colours are generated. In Wuhan there are three code colours: green indicating the ability to travel freely, yellow requiring self-quarantine for one week and red requiring self-quarantine for two weeks. However, the health code system is
not unified across the country, and so this will work differently in
different areas. Some provinces allow a one-day health code green pass
following a temperature check, for instance.
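To make the tiered logic concrete, the sketch below models a simplified colour assignment in Python. It is purely illustrative: the field names, the risk rules and the cross-verification step are assumptions made for the purpose of the example, since the actual scoring criteria used by the health code platforms have not been published.

```python
from dataclasses import dataclass
from enum import Enum


class CodeColour(Enum):
    """Wuhan's three-tier scheme, as described above."""
    GREEN = "travel freely"
    YELLOW = "self-quarantine for one week"
    RED = "self-quarantine for two weeks"


@dataclass
class Declaration:
    """A simplified, hypothetical subset of the user-declared fields."""
    has_symptoms: bool
    contact_with_confirmed_case: bool
    visited_high_risk_area: bool
    declared_trips: frozenset  # journeys the user reports having taken


def cross_verify(decl: Declaration, transport_records: frozenset) -> bool:
    """Hypothetical consistency check against aviation, railway, highway,
    bus and subway records: every recorded trip must have been declared."""
    return transport_records <= decl.declared_trips


def assign_colour(decl: Declaration, transport_records: frozenset) -> CodeColour:
    """Illustrative risk rules only -- the real criteria are not public."""
    if decl.contact_with_confirmed_case:
        return CodeColour.RED
    if (decl.has_symptoms
            or decl.visited_high_risk_area
            or not cross_verify(decl, transport_records)):
        return CodeColour.YELLOW
    return CodeColour.GREEN


# Example: a declared trip to a high-risk area yields a yellow code.
decl = Declaration(has_symptoms=False,
                   contact_with_confirmed_case=False,
                   visited_high_risk_area=True,
                   declared_trips=frozenset({"rail: Wuhan-Shanghai, 30 March"}))
print(assign_colour(decl, frozenset({"rail: Wuhan-Shanghai, 30 March"})))
# -> CodeColour.YELLOW
```

In practice the real systems draw on far more data sources and, as noted above, the rules vary from province to province.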
There is a legal basis for this health code system in China.
Regulations on prevention and control of infectious diseases and the
regulations on health emergencies stipulate that in the process of
epidemic prevention control, the government and health institutions have
the power to collect personal information for the purpose of epidemic
prevention and control. Individuals have the obligation to report
personal information such as ID card, address, health status and contact
history. Relevant departments and agencies have the legal basis to
change the purpose of information use.
We appear to see success, not just in China but across East and Southeast Asia, with these kinds of mobile apps combined with intensive testing programmes. Standard testing and contact tracing face challenges
in a context where an estimated 50% of all disease transmission is
happening during early infection stages and before symptoms start. These
emerging and data-driven technologies have merits.
Ethical considerations
There are ethical issues to consider with these technologies.
Informed consent and privacy are serious concerns for the Chinese people. According to a survey of 16 health codes, covering 14 provinces and cities, when applying for a health code there was no user service or privacy agreement with the app. However, on the national health code platform the applicant is required to consent to the user service agreement and privacy policy. This specifies what data will be collected, that it will only be used for recording users’ health information for epidemic
prevention, and declares it will protect personal information and not
permit the use of the information in the health code app to invade
privacy.
However, it has also been stated that the Shanghai health code system
will be used to provide data services for work, life and business.
Alipay have said the health code will be formally associated with
electronic social security cards and health cards, and can be used for
hospital and insurance payments. This raises concerns about the invasion
of privacy.
There is major potential benefit to society in enabling normal life to resume safely and socially responsibly. But there is also scope for improvement in these technologies to implement informed consent and protect privacy. In China it is considered that everybody has a right to be protected and treated when infected and, in turn, also has the obligation to protect others, including providing their health and other related information to the agencies responsible for preventing and controlling the epidemic.
Now the challenge is how to improve the health code systems and
achieve a balance between personal interest and societal good. The state
should accelerate the formulation and implementation of technical
standards to unify and standardise data collection, strengthen scrutiny
of data management, prevent data leakage and publicise relevant
documentation in a timely manner. Each health code operating institution should improve its user agreement and privacy policy to guarantee users’ rights to be informed about how their data is used, and to avoid data abuse. This is an urgent task in China, because the technologies are now widely in use. After the epidemic is over, mechanisms for data deletion should be established. If it is really necessary to continue using relevant data,
the purpose of that use should be clearly defined and the citizen’s
authorisation should be obtained.
Questions
Participants submitted 85 questions for the panel. We didn’t get to all of them, but we share some key highlights here:
Is one of the real risks
that the governance systems may not be able to avoid function
creep and the continued use of emergency data and infrastructure going
forward?
GDPR includes provisions for deleting personal data when it is no longer necessary. However, beyond concerns about infractions, it makes no provision for anonymised data: technically, that can be kept forever, even though, as aggregated data, it may reveal things about groups, neighbourhoods, a particular ethnicity, country, gender, poverty status or activities. This is something to be cautious about.
Chinese smartphone penetration is at around 50%. How do these
systems deal with those who don’t own or use smartphones? What do we
know about their effectiveness?
It is still very early, so there has not yet been the opportunity for thorough investigation. The variation of health code
systems by city/province allows for some customisation to that region’s
needs. There is a non-electronic alternative: a local paper certificate
with an official stamp, that relies on daily community reporting
systems. The level of community infrastructure in China is different
from many other nations – for instance, whilst there are residential
community health reporting systems in place, there are also often ones
within universities or other communities, offering many ways to interact
beyond the mobile apps.
How can we build public trust in the technologies, in the government’s use of data, in health data use?
Trust doesn’t come without trustworthiness. We’re asking tech to be
part of government, and we’re asking government to make decisions about
tech on our behalf in a way that hasn’t happened in recent history. We
should therefore be looking to apply governmental accountability to the
tech sector: tech that’s part of government and governance needs to be
under democratic control. The Ada Lovelace Institute report is a great
starting point on this thinking.
If an immense amount of value is created through data
collection, it could become private value or public value. How do we
ensure it’s the one we want?
There is not a single answer to any of these questions – they are
contextual. We need meaningful structures for scrutiny of decision
making around these problems. We need new governmental infrastructure to
establish the new forms and levels of scrutiny, including links to
civil society that don’t currently exist, to tackle these questions and
ensure that they are, in fact, the right questions.
Commercial use of this personal data is also a concern in China. The
national platform, Alipay and WeChat all use the health code system –
it’s a new kind of public-private cooperation. We should take into
account how to protect personal data and privacy from future commercial
use. Whilst some dilemmas and difficulties might be overcome by
technology itself, we cannot rely on technical fixes to social problems
and must be alert to them. We need transdisciplinary experts to
establish the accountability mechanisms for oversight of the use of
these technologies.