When London’s Royal Free NHS Trust signed on the dotted line with Google DeepMind to build an innovative kidney-monitoring app in 2016, it’s safe to assume they didn’t see themselves being handed a public ticking off by the UK Information Commissioner’s Office (ICO) less than a year later.
Streams sounded impressive. It could deliver alerts to doctors on a patient’s condition, and even spot people needing treatment on the basis of anomalous results that would normally take days to surface through established processes.
Crucially, because it drew on large amounts of background patient data, it could weave its big-data magic to spot the sort of patterns that aid more rapid clinical decision-making.
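For readers curious what such flagging looks like in practice: Streams was reported to implement the NHS acute kidney injury (AKI) algorithm, which compares a new serum-creatinine result against a patient’s baseline. The sketch below is a simplified illustration of that kind of check, not DeepMind’s code; the function name and the bare ratio thresholds are assumptions, and the real algorithm involves further criteria.

```python
# Illustrative sketch only. Streams reportedly used the NHS AKI algorithm,
# which flags patients when a new serum-creatinine result spikes relative
# to their baseline. Thresholds here follow the published staging ratios,
# but this omits the algorithm's other criteria (e.g. absolute rises).

def aki_stage(current_creatinine: float, baseline_creatinine: float) -> int:
    """Return an AKI warning stage (0 = no alert) from the ratio of the
    latest serum-creatinine result to the patient's baseline value."""
    ratio = current_creatinine / baseline_creatinine
    if ratio >= 3.0:
        return 3  # severe: urgent clinical review
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1  # early warning that might otherwise take days to spot
    return 0

# Example: 180 umol/L against a baseline of 90 umol/L doubles the ratio,
# which would trigger a stage-2 alert to clinicians.
print(aki_stage(180.0, 90.0))  # -> 2
```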
Despite this well-intentioned ambition, we can now surmise from the ICO’s formal rebuke that the project quickly became one in which patient privacy was put at risk to serve the logic of DeepMind’s development ethos.
From the outset, the ICO said, the wording of the agreement between the two parties was weak: it gave DeepMind too much latitude to process data as it pleased.
On that score, the trust completely failed to explain why the project needed access to 1.6m patient records stretching back several years, or why patients were left in the dark about how those records were being used to set up and test Streams.
One might assume from this that the ICO would be eager to hand out a verbal lashing, and yet the written judgement chooses its words rather carefully.
Wrote Elizabeth Denham, Information Commissioner:
We’ve asked the trust to commit to making changes that will address those shortcomings, and their co-operation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used.
This is fairly mild medicine by ICO standards, doubtless because the regulator viewed the offence as inadvertent and careless rather than downright incompetent. The trust will also be able to continue developing Streams.
Public-sector organisations hoping to follow in the Royal Free’s footsteps by partnering in similar projects will pay close attention to the judgement, aware that the trust has done them a favour by being the first to fall on its face.
With everyone keen to move on, uncomfortable questions linger. The first is simply why it took a third party reporting its concerns before the ICO investigated what was going on. It’s not as if other organisations hadn’t raised worries of their own by the time the National Data Guardian (NDG) became involved.
Similarly unsettling is that both the Royal Free and DeepMind went out of their way to reassure critics, stating, for example, that data was encrypted at all times, remained in an independent UK datacentre, and met every public-sector requirement for good data governance.
But here we are, only months later, and the ICO has confirmed that the project broke the Data Protection Act (DPA) on several counts, with the trust at fault for allowing this to happen.
It’s hard to escape the feeling that without more comprehensive oversight, the NHS is making this up as it goes along. Writing a watertight agreement with a big-data company is hard enough, let alone ensuring its conditions are met.
Apps such as Streams hold great promise, but the larger risks in these kinds of projects seem to be overwhelmingly on the public-sector side. We must hope there will be no catastrophic “learning experiences” down the line.