We all need different kinds of medical care at different points in our lives. When we interact with our GPs and healthcare providers, we hope that our doctors and nurses know everything they need to know to help us get better. We want to get the best possible care and we want the experience to be as painless (in all senses) as possible. Making this happen will require better data sharing between the organisations and people providing medical care, and better use of health data, whether in the private or public sector.
Google’s DeepMind announced the launch of DeepMind Health in early 2016, aiming to improve direct patient care using data. There are positives in this: the NHS wants to improve patient care and keep people healthy, and DeepMind are very good at using data and understanding its potential. But there is also good reason to be cautious: other flawed initiatives have already left suspicion around projects that use our medical records, and a lack of trust that what is being done with our data is secure, ethical, and ultimately for our benefit.
Over the last week, Google’s collaboration with the Royal Free NHS Trust, which has given DeepMind access to live and historic healthcare data on 1.6 million UK patients, has been making headlines of the wrong kind. The data sharing agreement between the two, drawn up as part of DeepMind’s development of its service Streams, was leaked to the public. While the project is focused on Streams, an app that helps staff monitor patients with kidney disease, the agreement gave DeepMind access to a wider pool of hospital data than seemed necessary.
The potential benefits of this collaboration (improving direct patient care, specifically around Streams) remain persuasive. What has concerned people, and fuelled suspicion, is the lack of transparency around how their health data is being shared, with whom, for what purpose, and with what impact. This leads to a breakdown in trust, ultimately stifling the positive uses of our health data that we all want to see.
Maintaining trust in our healthcare providers
For healthcare systems to function, it’s essential that patients trust doctors with their confidential information. Current policy around data spells out some key principles, both on the rights of individuals and the duties of organisations. The NHS Constitution outlines the right of patients to privacy, to be informed about how their information is being used, and to opt out if they wish. The Caldicott Principles, which emerged from a review commissioned in 1997 in response to concern about how information technology was affecting the way patients’ information was used, set out the responsibilities of organisations. They include “justify[ing] the purpose” of transferring data and “us[ing] the minimum necessary patient-identifiable data possible.”
Critics of the data sharing agreement between DeepMind and Royal Free have argued that it meets neither of these conditions. Specifically, they say that the purpose of the deal was not clearly defined (which raises concerns about what the data will be used for, not least given that Google holds other data with which NHS data could be combined) and that the patients affected were not sufficiently informed to be able to opt out. DeepMind and Royal Free have countered these claims, arguing that the agreement followed NHS information governance processes, that connections to the information are encrypted, and that patients can contact the Trust to opt out.
What is clear is that this situation has arisen from a lack of transparency. Trust is an essential component of better data sharing, and the demand for it in healthcare is acute. As Dame Fiona Caldicott said in a King’s Fund interview published on Tuesday:
“If the public is willing to trust health and care services with its data, there can be huge benefits for everyone… But there is little public awareness of the way that information is shared, and that trust has not yet been earned.”
Openness about personal data can create trust
The ODI has been exploring openness and open data as mechanisms for improving trust in how data about us is used. Our personal data design principles encourage organisations collecting and processing personal data to be open about what they are doing, how data will be kept secure, and how people can opt out of having their personal data used. Transparency around personal data processing and data portability are concepts at the core of the European Union’s General Data Protection Regulation (GDPR), which will come into force within the next two years.
DeepMind almost certainly have the expertise and tools to store patient data securely and to treat its use with care. They outline some of their data security and privacy intentions on their website, to help patients be “certain that their health data is handled with the utmost care and respect at all times,” and they have a review board scrutinising their work every quarter.
The problem is, people don’t like surprises. And right now, people don’t know whether they can trust DeepMind. A key part of building trust is showing people how you are storing, using, and analysing their personal data in practice.
Open data sharing by design
For DeepMind and Royal Free’s collaboration, being open about the nature of the project, including data security and privacy, might have changed the debate we’re now having. Open by design might include proactively publishing information about the following (sketched in machine-readable form after the list):
- the nature of the data being shared with DeepMind (both live and historic)
- how many people at DeepMind have access to the data, the internal processes for keeping it secure, and the accountability mechanisms in place
- restrictions on how DeepMind can use that data
- what happens to the data at the end of the data sharing agreement
- how people can get in touch to request deletion of their data
- their quarterly health review board meetings
- their data audit processes, and their procedures in the event of personal data leaks (and any instances of such leaks)
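To make the idea concrete, here is a minimal sketch of what one entry in a public, machine-readable register of data sharing agreements could look like. The field names and values are purely illustrative, drawn from the list above; they are not an actual NHS, Royal Free, or DeepMind schema.

```python
import json

# Hypothetical register entry for a data sharing agreement.
# All field names and values are illustrative only -- they are not
# drawn from any actual NHS, Royal Free, or DeepMind publication.
agreement = {
    "data_controller": "Royal Free NHS Trust",
    "data_processor": "DeepMind",
    "purpose": "Development of Streams, an app to help staff "
               "monitor patients with kidney disease",
    "data_shared": ["live hospital data", "historic hospital data"],
    "processor_staff_with_access": None,  # figure to be published by the parties
    "use_restrictions": "as set out in the agreement",
    "end_of_agreement": "data returned or securely deleted",
    "opt_out_contact": "the Trust's patient contact point",
    "review_board_minutes_published": "quarterly",
    "audit_process_published": True,
    "data_leak_incidents": [],  # any instances would be listed here
}

print(json.dumps(agreement, indent=2))
```

Published routinely in an open format like this, such records would let patients, researchers, and regulators see at a glance who holds their data and on what terms.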
We now know much of this information as a result of the leaking of the data sharing agreement. How different might the story have been if transparency were routine, and there were mechanisms for feedback?
Royal Free could openly publish data about the data sharing agreements they have in place with other organisations. They could create an open call for the key challenges they face where data might help, supported by an open tendering process. These mechanisms would help other organisations, researchers, and healthcare providers to see where they might be able to help. The NHS information governance department could make open publication of information about personal data sharing a precondition of any sharing taking place.
If we are to realise the potential of data to transform healthcare, or other sensitive fields, we must bridge this data-trust deficit. Designing data projects with openness at their core is going to be fundamental to building trust.
Ellen Broad, Head of Policy, and Tom Sasse, Open Data Institute.
Competing interests: None declared.