Media Coverage

Kingsley Hayes explores the impact of NHS data sharing on privacy rights

Kingsley Hayes, Head of Data Breach, explores the impact of NHS data sharing on privacy rights in AI Business.

Kingsley’s article was published in AI Business on 16 September 2021 and can be found here.

The enduring link between health and data has saved thousands of lives during the pandemic. But the history of NHS data usage is more problematic, particularly around data sharing with third parties and data protection. In May, the government announced the General Practice Data for Planning and Research scheme: for everyone registered in England, GP health data would be made available for healthcare research and planning. To protect their privacy, individuals’ identities would be partially removed.

But privacy campaigners cautioned that the removal of identities could be reversed. Within weeks, an online campaign encouraging people to opt out of NHS data sharing had gained nearly 1.4m supporters. Following the backlash against making patient data available to private companies, the plan was put on hold.

The NHS has a chequered history in data sharing and protection. In 2017, the Royal Free NHS Foundation Trust was censured by the Information Commissioner’s Office (ICO) over data on 1.6m people which the Trust had handed to Google’s DeepMind division (an AI company) to enhance its machine learning capability. The ICO ruled that the Royal Free had failed to protect patients’ privacy and that it was “inexcusable” that they had not been informed.

Since the GDPR became effective in 2018, there has been significant concern about NHS Digital’s failure to secure appropriate consents to data sharing in a flawed IT project. Currently, the NHS is working on AI projects via NHSX that deploy machine learning in research and development, raising questions over how much people know about how their data is being used.

This year, Big Brother Watch reported that NHS Digital’s management of Covid vaccination status data had failed to deliver basic safeguards, meaning that the information could be exploited by insurers, companies, employers and scammers. Once it was revealed that people’s vaccination status was being leaked, NHS Digital altered its vaccination booking website.

Potential misuse is a big issue when the NHS shares confidential patient data with third-party organisations. What happens to that data if it becomes part of an AI project? Critically, what do data subjects know about the consent they give for that data to be processed, and about where it ends up?

The NHS has nearly 1,000 external suppliers, while the number of NHS Digital supply chain partners is not itemised. NHS Digital simply states: “Our supply chain partners are fundamental to our ongoing success, creating significant value through the delivery of new thinking and innovative solutions. Through the deployment of Strategic Supplier Relationship Management (SSRM) we are focused on creating an effective and collaborative relationship with our most important suppliers, creating additional value and innovation that goes beyond our contracts.” It adds the following on data usage: “We ensure that external organisations can access the information they need to improve outcomes, and the public are confident that their data will be stored safely by NHS Digital.”

Public confidence in NHS Digital’s commercial relationships is open to question. The Data Protection Act 2018 and the GDPR are designed to ensure that an individual data subject – the person giving consent – is fully apprised of all uses of that data and of how long it will be stored.

Post-GDPR, uncertainty persists around AI projects: how often data is used in the machine learning process and where it ends up. The designers of machine learning programmes carefully guard information about the algorithms underpinning them. Individual data subjects do not know what happens to their data: there is very little transparency. Individuals are rightly concerned about whether, once they have given consent, they can withdraw it and have their data removed from the AI system. If they cannot, that does not accord with GDPR principles.

The EU is now considering GDPR-like regulation for AI, but the UK seems unlikely to follow. In the UK, GDPR and data protection rights will probably evolve more through judicial intervention than through additional regulation. Ultimately, this divergence from the EU will result in greater judicial divergence from EU law.

There is cause for concern over NHS Digital’s future relationships with third-party companies. Given the track record, state-owned entities like the NHS simply do not seem to have the technical capability to understand what AI projects do. The NHS buys in outside resources which sometimes have their own agenda. Notably, some US tech companies operating in the NHS market have a very well-established agenda around the provision of current and future services, and how they can monetise them.

NHS Digital’s technical capability is not in doubt, but its understanding of some of the tech companies with which it does business may well be. Those companies’ core objectives do not easily align with those of the NHS. Although the NHS has sophisticated diagnostic processes to facilitate early diagnosis of difficult conditions, essential changes need to be made. In some areas, it is floundering in terms of technical systems: the more the NHS relies on outside agencies, the greater the risk that it will not have the appropriate level of compliance, particularly where no alignment of interests exists.

Regrettably, significant data misuse seems inevitable, as does ensuing litigation. Increased consumer understanding and greater transparency in how individuals’ data is managed will also follow. Presently, few people appreciate the commercial value of their personal and medical data. In some instances, it is probably worth more than gold.

Greater awareness will be achieved through investigation into what tech and AI products are doing. Access to information will then enable people to understand what is happening and, hopefully, allow them to regain some control. Inevitably, this will provoke litigation against some of the organisations with which the NHS has commercial relationships: their financial motives will lead to actions against them. How data is implemented and processed will be key. People will not sue the NHS for how it endeavours to improve its services and handle their data. Instead, those that handle that data on its behalf will be in the line of sight.
