Guest post from Paquita de Zulueta, former Council member and Honorary Senior Clinical Lecturer at Imperial College London
On 12 May this year, the (then) Health Secretary Matt Hancock issued, with minimal publicity, a legal direction to every GP in England instructing them to upload their patient records to a central database, giving patients just a few weeks to find out about the plans. The start date for the General Practice Data for Planning and Research (GPDPR) programme was delayed from 1 July until 1 September after the Royal College of GPs and the British Medical Association requested a delay and a public information campaign, so that the public could be properly informed and given the choice of opting out. The Government also faced a legal challenge. The GP data is to be ‘pseudonymised’, but it can be de-anonymised, particularly as NHS numbers will be included and the data will be linked to other data sets. Many GPs – the designated ‘data processors’ – are deeply concerned. In fact, many of us have a sense of déjà vu from care.data, but alarmingly the planned mining is deeper and wider than in 2014.
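The re-identification risk is easy to illustrate. The sketch below is a hypothetical example, not the actual GPDPR scheme: it assumes a record is ‘pseudonymised’ simply by hashing the NHS number, and shows how anyone holding another data set keyed by NHS number can rebuild the same hashes and link the supposedly anonymous record back to a named person.

```python
import hashlib

# Hypothetical illustration only (not the real GPDPR pipeline):
# "pseudonymise" a record by replacing the NHS number with its SHA-256 hash.
def pseudonymise(record: dict) -> dict:
    out = dict(record)
    out["nhs_id"] = hashlib.sha256(record["nhs_number"].encode()).hexdigest()
    del out["nhs_number"]
    return out

gp_record = pseudonymise({"nhs_number": "485 777 3456", "diagnosis": "depression"})

# An attacker with any other data set keyed by NHS number (an insurer's
# list, say) can hash those numbers and look the pseudonym straight up.
other_dataset = {"485 777 3456": "Jane Doe"}
lookup = {hashlib.sha256(k.encode()).hexdigest(): name
          for k, name in other_dataset.items()}

print(lookup[gp_record["nhs_id"]])  # the pseudonym is reversed by linkage
```

The point is general: pseudonymisation preserves a stable identifier precisely so that data sets can be linked, and that same property is what makes de-anonymisation possible.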
So what are the ethical (and legal) issues?
Confidentiality and consent
Medical confidentiality has a long and venerable history and is a key tenet of professional codes of ethics, from the Hippocratic oath (over 2,500 years old) to current guidance on doctors’ duties from professional bodies such as the General Medical Council and the World Medical Association. Confidentiality is considered essential for gaining and sustaining trust. It is not absolute, and other ethical considerations may supersede it, but this does not mean it can be ignored or overridden wholesale by ill-defined utilitarian considerations of the ‘public good’.
GPDPR raises related ethical and legal issues around consent. Medical records held by GPs contain highly sensitive and personal data, with everything from medications to diagnoses that include mental illness, abortions, sexually transmitted diseases, suicide attempts, and addictions. Having practised as an NHS London GP for 35 years, I am only too aware of the very personal and intimate nature of the information that patients offer me. Often I heard the remark “I have not told this to anyone else, doctor.” These conversations were held in the context of a trusting relationship on the assumption that the information given would not be shared beyond those who are involved in their care.
Given this situation, it is undeniable that patients should be given the opportunity to decide who has access to their data and for what purposes it will be used. This shows respect for persons and for their human rights, such as protection of private and family life from an intrusive state, as highlighted in the Nuffield Council on Bioethics’ 2015 report on biological and health data. A government has a legal and moral duty to consider the impact on people’s rights and to meaningfully engage with the public when planning to process health data on a national scale.
So far, we have not seen impact studies, and there has been scant attempt to inform, let alone engage with, the public on this issue. Furthermore, consent does not mean control: it does not allow data subjects to fully determine the aims or purposes of the use of their personal health data, to control data access agreements, or to specify sanctions for breaches. “Research and planning purposes” can cover many activities, some of which patients may not approve of.
The false dichotomy between private interest and public good
‘Good’ and ‘public benefit’ are subjective concepts and will vary according to individual perceptions and context. Private and public interest are inevitably intertwined, and pitting them against each other creates a false dichotomy. For example, if patients cease to trust their clinicians, or more broadly the NHS, the public good will suffer. Furthermore, extensive exploration of public attitudes towards sharing medical data has found that people generally approve of their data being used for medical research and for ‘good causes’, whether environmental, social or medical, but they do not approve of their data being used for commercial purposes, or of powerful companies profiting at society’s expense. They also want data sharing to be privacy-preserving and trustworthy, to retain some control over the purposes for which their data is used, and to keep the freedom to opt out.
Misleadingly, a message has been given that only with GPDPR will health professionals be able to share patient information in order to provide good care. But electronic systems for sharing, such as ‘the spine’, already exist. Opting out of GPDPR will not affect communication in the provision of direct patient care. Successful research studies carried out during the COVID-19 pandemic, such as the RECOVERY trial, were conducted according to strict research ethics guidelines and with patient consent, not with a ‘data grab’.
Justice
Francis and Francis argue that focusing on the conflict between individual or group rights and the public good obscures issues of justice. They cite the use of data in addressing HIV rates and race in the US; the ensuing public health initiative was perceived as discriminatory and backfired. They propose three strategies for enhancing justice: transparency, participation, and reciprocal benefits for those affected by data collection and its use. GPDPR arguably fails on the first two counts, and possibly the third.
Data stewardship and accountability
Sarah Cheung, in an insightful article, demonstrates how the ‘trade-off fallacy’ and ‘obfuscatory practices’ negate individuals’ control over the future use of their personal health data and enable the widespread involvement of commercial actors in accessing and using personal data. Consumers can withhold their data from companies they do not approve of, but patients and service users cannot avoid seeking healthcare, and are therefore disempowered and more vulnerable to exploitation.
The Ada Lovelace Institute’s Rethinking data programme offers us another lens: to view data as part of the ‘commons’ – a public good to be shared within a framework that prevents a free-for-all. The late economist Elinor Ostrom’s eight principles for managing the commons are adapted for the purpose of data sharing. These include boundary setting, localism (with rules dictated by local people according to their needs), participatory decision making, accountability, effective sanctions, and accessible conflict resolution. Data differs from concrete goods in that it is ‘intangible’ and can be reused limitless times, making it both more valuable and more open to exploitation.
Commercial interests?
Health data is valuable for the public good, but it is also valuable to big business. The UK’s NHS data has been valued at around £10 billion. We know that the NHS has links with commercial technology giants and the corporate sector. For example, NHSX holds the extensive NHS COVID-19 datastore and has contracts with Palantir – a US company better known for supporting spy agencies, militaries, and border forces – as well as with Amazon, Google, and others to provide data analysis and management. How aware is the public of this, and how comfortable are they with such companies having access to their medical data, even with safeguards?
Broader considerations
As we increase our reliance on digital infrastructures – ‘Digi Health’ – for healthcare delivery, we need to ask ourselves what kind of healthcare we will have, and whether it is what we actually want. Moreover, we already have the data on how to improve the wellbeing and health of the population: by tackling the social determinants of health. These are more important than healthcare or lifestyle choices in influencing health and in creating health inequalities, as the Marmot report and the pandemic have convincingly shown.
Conclusion
Proper arrangements should be made to inform and engage the public, and to seek their valid consent to gift their medical information for beneficial purposes which they support. After all, ‘data’ comes from the Latin verb ‘dare’ – to give. Those who offer valuable gifts should be honoured and treated with respect.