Media Coverage

Kingsley Hayes discusses the impact of NHS data sharing

Head of Data Breach, Kingsley Hayes, explores the impact of NHS data sharing on privacy rights in The Barrister.

Kingsley’s article was published in print in this quarter’s edition of The Barrister.

Among the many lessons to be learned from the Covid-19 pandemic is the inextricable link between health and data: effective use of the latter has undoubtedly helped save many lives over the past year. However, long-term use of NHS data is more contentious, not least the issue of data sharing with third parties and the protection of that data. In May, the government announced the General Practice Data for Planning and Research scheme, under which GP health data for everyone registered in England would be made available to researchers and companies for healthcare research and planning, with people’s identities partially removed.

But according to privacy campaigners, the process to remove identities could be reversed, which led to a widespread online campaign encouraging people to opt out. In August, the Observer revealed that nearly 1.4 million people had opted out of NHS data-sharing in May and June, following a huge backlash against the plan to make patient data available to private companies. As a result, the plan has now been put on hold with no new implementation date yet fixed.

Privacy campaigners can also point to the NHS’s chequered history in data sharing and protection. In 2016, the Royal Free NHS Foundation Trust handed data on 1.6 million patients to Google’s DeepMind division (an AI company) during the early stages of an app test to enhance its machine learning capability, and was censured by the UK’s Information Commissioner’s Office (ICO). The ICO ruled that the Royal Free did not do enough to protect the privacy of patients, and that it was “inexcusable” that they had not been told what had been happening to their data. The information commissioner, Elizabeth Denham, said that attempts to make creative use of data had to be carefully managed. “The price of innovation does not need to be the erosion of fundamental privacy rights,” she added.

Since the GDPR came into force in May 2018, NHS Digital has had further significant issues in securing the appropriate consents for data-record sharing in an IT project with glaring failures. Meanwhile, the NHS is working on AI projects via NHSX to use machine learning in research and development. Again, questions exist around data transparency and public consent with regard to the use of personal data within those projects.

In May, Big Brother Watch reported that NHS Digital’s management of Covid vaccination status data had failed to deliver even basic safeguards, which could lead to information being exploited by insurers, companies, employers or even scammers looking to defraud individuals. Director of Big Brother Watch, Silkie Carlo, said: “This is a seriously shocking failure to protect patients’ medical confidentiality at a time when it could not be more important. This online system has left the population’s Covid vaccine statuses exposed to absolutely anyone to pry into. Robust protections must be put in place immediately and an urgent investigation should be opened to establish how such basic privacy protections could be missing from one of the most sensitive health databases in the country.” After it was revealed that the system leaked people’s vaccination status, NHS Digital altered its Covid vaccination booking website.

Potential or actual data misuse is the big issue when the NHS shares confidential patient data with a third-party organisation. If that personal data is provided as part of an overall AI project, what happens to it, where does it go, where does it sit, and how many times does it get processed? Ultimately, the key questions for the individuals concerned are: what does a data subject know about the consent they have given for the processing of that data, where will it then be used, and how many times will it be used?

The number of external suppliers to the NHS is substantial: 28 million lines of picked goods are delivered to the NHS annually, with consolidated orders from over 930 suppliers. The number of supply chain partners working with the NHS Digital Commercial team of procurement professionals is not itemised.

The NHS Digital team states: “Our supply chain partners are fundamental to our on-going success, creating significant value through the delivery of new thinking and innovative solutions. Through the deployment of Strategic Supplier Relationship Management (SSRM) we are focused on creating an effective and collaborative relationship with our most important suppliers, creating additional value and innovation that goes beyond our contracts.” It adds the following in relation to the collection and dissemination of data: “We ensure that external organisations can access the information they need to improve outcomes, and the public are confident that their data will be stored safely by NHS Digital.”

What happened with the Royal Free, combined with more recent events, demonstrates that public confidence in NHS Digital’s commercial relationships with external organisations is open to question. The Data Protection Act 2018 and the GDPR are designed to ensure that an individual data subject – the person giving consent – is fully apprised of all uses of that data: where it will end up, how it will be treated and, ultimately, whether it will be retained or disposed of.

Even after the implementation of the GDPR, an element of mystery still surrounds AI projects: how often is the data utilised in the machine learning process, and where does it ultimately end up? The overarching problem is that the designers of AI and machine learning programmes closely guard information about how the algorithms underpinning those programmes work. Once data has been provided, individual data subjects do not know what has happened to it; there is very little transparency in the process. Moving forward, the concern for any individual is whether, once they have given consent, it is possible to withdraw it and remove that data from the AI system. If not, then the process does not accord with the principles of the GDPR and the rights of data subjects.

The EU is now looking at AI regulation comparable in many ways to the GDPR. But the UK’s direction of travel appears to be that this is one area where it will not keep alignment in place. GDPR and the protection of data rights is an area which will probably evolve more by judicial intervention than by additional regulation. Over time, UK divergence from the EU will lead to judicial divergence from laws created by the EU.

When considering the future relationships of NHS Digital with third-party companies, there is cause for concern. Based on the track record, it is reasonable to assume that state-owned entities like the NHS simply do not have the technical capability to understand exactly what AI projects can and will do. The NHS is buying an outside resource which, necessarily, sometimes has its own agenda. An objective look at some US tech companies operating in the NHS market reveals a very fixed, well-established agenda around the provision of services, an understanding of what services will be required in the future, and how they can monetise them.

The problem lies less with the technical capability of NHS Digital, and rather more with a lack of understanding of the core objectives of some tech companies with which it is doing business. These core objectives do not necessarily align with those of the NHS. Essential process changes need to be made within the NHS, but it is simultaneously floundering over how to achieve them technically: the more the NHS relies on outside agencies, the greater the risk that it will not have the appropriate level of compliance, particularly where interests do not align.

Against this background, significant data misuse seems inevitable and will ultimately lead to litigation. The key driver will be consumer understanding and a demand for greater transparency in how individuals’ information and data are handled. At present, most people do not appreciate the value of their personal and medical data; in some instances, it is probably worth more than gold. Over the next few years, there will be greater investigation into some of these tech and AI products.

Dissemination of such information will enable the public to understand and regain control of their personal data. It is inevitable that this will provoke litigation – not against the NHS, but against some of the organisations with which it has commercial relationships. The motives of, and monetary gain sought by, third-party suppliers will lead to actions against them, and the implementation and processing of data will be key. The public will not sue the NHS for dealing with personal data when seeking to improve its services. The tech companies responsible for handling the data will be the ones in the line of sight.

