Technology giants want access to public-health records in return for developing life-saving apps in hospitals. But is that a fair trade?
A frank apology can go a long way towards rebuilding trust after a media storm.
DeepMind, the artificial intelligence company owned by Google, appears to have learned that lesson.
“We are sorry that we fell short when our work in health initially began, and we’ll keep listening and learning about how to get better at this,” a DeepMind Health spokesperson told The Medical Republic.
The confession follows a damning finding by the UK privacy watchdog last month.
A year-long investigation by the Information Commissioner’s Office found that a UK public hospital, Royal Free NHS Foundation Trust (NHS Trust), had acted illegally by giving DeepMind access to 1.6 million identifiable patient records without patient consent.
The personal records were used to test a mobile app, Streams, for the detection of acute kidney injury in UK hospitals.
DeepMind’s copy of the data remained under the strict control of the NHS Trust and the app proved a success. But concerns were raised over the amount of access DeepMind was given to historical and current identifiable public records. The five years’ worth of data included sensitive and potentially embarrassing information on HIV status, abortions and drug overdoses.
While this information could sometimes be relevant to the treatment of acute kidney injury, and was held securely, it was not necessary to completely override so many patients’ rights to privacy in the testing phase, Commissioner Elizabeth Denham said.
“NHS organisations, perhaps more than any sector, need to remember that we are talking about the medical information of real patients,” she said.
This recent skirmish in the UK strikes at the heart of a debate occurring globally. Governments, including Australia’s, are anxious to use big data as a force for good. But bungled efforts to share personal health data with researchers and the private sector have attracted harsh criticism.
At the core of this moral panic around data privacy is the question: who can we really trust with our intimate health records?
The good guys?
Google’s artificial intelligence business currently has access to millions of highly sensitive health data points linked to individuals. But DeepMind argues that it can be trusted with this information, and that the benefits of granting such access outweigh the possible negatives.
The company makes a good case. DeepMind has taken a number of steps to open its operations up to public scrutiny, even though the UK commissioner’s ruling applied only to the NHS Trust project.
DeepMind has proactively published its service agreements with the NHS hospitals in full online, which is unheard of among software providers.
And long before any media attention, the company established a panel of unpaid independent reviewers, including the editor-in-chief at The Lancet, to critically examine the company’s practices.
The first annual report was published online in July. Reviewers expressed concerns about the company’s lack of public engagement, particularly in relation to the perception that data processed by DeepMind could be shared with Google (it can’t be).
DeepMind responded to the reviewers’ recommendations by recruiting BMJ patient editor Paul Buchanan, and by hosting two patient-consultation events.
DeepMind is also drawing on its cyber-security smarts to engineer a real-time auditing system, which would keep an unalterable digital ledger, similar to a blockchain, of the purposes for which data has been used. But making such a system “provably trustworthy”, such that no other software is able to secretly interact with data in the background, is a significant technical challenge, DeepMind admits.
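The core idea is simpler than it sounds. In a hash-chained ledger, each new log entry commits to everything recorded before it, so any retroactive edit breaks the chain. The Python sketch below is a minimal illustration of that principle, not DeepMind’s actual system; the record format and all names are hypothetical.

```python
import hashlib
import json
import time

def entry_hash(prev_hash: str, record: dict) -> str:
    """Hash this record together with the previous entry's hash."""
    payload = json.dumps(record, sort_keys=True)
    return hashlib.sha256((prev_hash + payload).encode()).hexdigest()

class AuditLedger:
    """Append-only log in which each entry commits to all earlier entries."""

    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []  # list of (record, hash) pairs

    def append(self, purpose: str, dataset: str) -> str:
        prev = self.entries[-1][1] if self.entries else self.GENESIS
        record = {"purpose": purpose, "dataset": dataset, "ts": time.time()}
        digest = entry_hash(prev, record)
        self.entries.append((record, digest))
        return digest

    def verify(self) -> bool:
        """Recompute the whole chain; any retroactive edit breaks it."""
        prev = self.GENESIS
        for record, stored in self.entries:
            prev = entry_hash(prev, record)
            if prev != stored:
                return False
        return True

ledger = AuditLedger()
ledger.append("AKI alert generation", "blood-test results")
ledger.append("service evaluation", "blood-test results")
assert ledger.verify()

ledger.entries[0][0]["purpose"] = "marketing"  # quietly rewrite history...
assert not ledger.verify()  # ...and the chain no longer checks out
```

The hard part, as DeepMind concedes, is not the chaining itself but proving that nothing else can touch the data without writing to the ledger.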
In a marked departure from most tech giants, which remain highly secretive about their use of personal data, the DeepMind media team provided generous responses to The Medical Republic’s questions and seemed very keen to demonstrate its commitment to transparency.
If actions speak louder than words, the Streams app is some proof that DeepMind can contribute to public health in a positive way. The app has only been rolled out in one hospital so far. But each day, the app analyses more than 2,000 blood tests and alerts doctors to an average of 11 patients at risk of acute kidney injury. Some staff said the app saved them up to two hours a day.
Right now, many hospitals depend on pagers and phone calls, which can be slow and unreliable, for information on any escalation of a patient’s condition. The Streams app overcomes delays by analysing a blood test and automatically sending an alert, similar to a breaking-news notification, directly to a specialist’s iPhone, along with other clinical information relevant to making a diagnosis of acute kidney injury.
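Because the app currently includes no artificial intelligence, the detection step is best thought of as a rule-based check on each incoming blood result. The sketch below is a simplified, hypothetical version of such a check, loosely modelled on the widely used KDIGO creatinine-ratio criteria; the thresholds, function names and alert hook are illustrative, not Streams’ actual logic.

```python
from typing import Optional

# Simplified, hypothetical AKI check loosely modelled on the KDIGO
# creatinine-ratio criteria; NOT the actual Streams algorithm.
def aki_stage(current_creatinine: float, baseline_creatinine: float) -> Optional[int]:
    """Return a suspected AKI stage (1-3), or None if the result looks unremarkable."""
    ratio = current_creatinine / baseline_creatinine
    if ratio >= 3.0:
        return 3
    if ratio >= 2.0:
        return 2
    if ratio >= 1.5:
        return 1
    return None

def send_alert(patient_id: str, stage: int) -> None:
    """Stand-in for a push notification to the responsible clinician's phone."""
    print(f"ALERT: patient {patient_id} - suspected AKI, stage {stage}")

def on_new_result(patient_id: str, current: float, baseline: float) -> None:
    """Called as each laboratory result arrives; alerts only when AKI is suspected."""
    stage = aki_stage(current, baseline)
    if stage is not None:
        send_alert(patient_id, stage)

# A creatinine of 180 umol/L against a baseline of 80 umol/L triggers a stage 2 alert.
on_new_result("patient-0001", current=180.0, baseline=80.0)
```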
DeepMind won’t know exactly how successful the app has been until the peer-reviewed service evaluation has been concluded later this year. In line with the company’s desire to be seen as transparent, the methodology for this study has already been published and peer reviewed online.
DeepMind owns the intellectual property behind the Streams app, which was originally developed using synthetic data. The company wants to build more capability into the technology, including detection of sepsis and diabetes complications. The app currently doesn’t include any artificial intelligence, but the company has a set of longer-term AI research projects.
Surprisingly, DeepMind has yet to delete the massive amount of real patient data illegally provided by the NHS Trust for the testing of the app. This is because the NHS Trust has not issued an instruction to DeepMind to destroy the information, a spokesperson said: “If and when we’re ever requested to delete data by a trust, we comply.”
The company feels it has been unfairly singled out for criticism due to its relationship with Google, which bought the company for £400 million in 2014. “Go ahead and talk to any of the other trusts and look at their agreements, look at any of the other software provider agreements,” DeepMind’s co-founder Mustafa Suleyman said last year. “What we’re doing is entirely standard and entirely conventional.”
An ill-fated decision
The NHS Trust’s ill-fated decision to hand over millions of medical records without patient consent to a search engine monopoly could be put down to inexperience. In its well-intentioned haste to deploy DeepMind’s expertise, the NHS Trust simply overlooked the looming public-relations nightmare.
But history shows this incident is not a singular “whoops” moment for the NHS Trust, as some claim. It’s the second of two major data-sharing catastrophes to shake public confidence in recent years.
The UK government’s plan to centralise GP patient records, called care.data, was scrapped last year after concerns were raised over the use of NHS data by third parties, including insurers. Over a million patients had opted out before the project was shut down.
During the uproar, a worrying number of NHS data-sharing arrangements came to light, including cloud storage of NHS data using Google’s BigQuery, and the use of partially anonymised data for insurance purposes.
These were not isolated incidents of commercial entities being fed public health data; the NHS had been slowly privatising its data analytics for many years, said Eerke Boiten, a cyber-security expert at the University of Kent in the UK.
Companies such as CSL-UK, Northgate and McKinsey were regularly commissioned to process NHS data. “This is probably a consequence of the bad history of the NHS’s large in-house IT projects,” Mr Boiten said.
The data analytics, in itself, was not the problem. The issue was that the NHS showed very little concern about companies using data for their own commercial purposes, or selling it on to other businesses, as long as some claim of “anonymity” or “pseudonymity” was made, Mr Boiten said.
“The public is now seeing what third parties are already doing with [NHS data], and they don’t like it,” he said.
The care.data project was thrown out following a review by the UK’s National Data Guardian, Dame Fiona Caldicott. She recommended the NHS completely exclude the use of medical data for marketing and insurance purposes and introduce a simplified opt-out system.
In a twist of fate, the NHS data chief who presided over the care.data disaster, Tim Kelsey, is now the chief executive of the Australian Digital Health Agency. Mr Kelsey had also previously founded Dr Foster, one of the health analytics companies that processed NHS data.
The backdrop to all this is a long-standing, secretive global trade in longitudinal patient records, including from the UK and Australia, by data companies such as QuintilesIMS.
Detailed information about GP consultations and prescriptions is passed on to QuintilesIMS, which then sells the data to pharmaceutical companies. Patients are de-identified but doctors usually aren’t, meaning that drug companies could use the data to attempt to influence the prescribing habits of individual doctors.
“QuintilesIMS has individual files on more than half a billion patients worldwide, but the company is little understood, even by doctors, nurses, and people in the healthcare system,” said Adam Tanner, a fellow at Harvard University and the author of Our Bodies, Our Data: How Companies Make Billions Selling Our Medical Records.
The Medical Republic asked QuintilesIMS what data it had on Australian patients and to whom it was selling it, but a media spokesperson said the company could not respond “due to scheduling conflicts and vacation schedules”.
“Generally, these companies that know so much about us are very protective and secretive about their own operations,” Mr Tanner said.
All this puts the recent turmoil over DeepMind’s dealings with the NHS into context, and adds legitimacy to concerns that private-sector access to medical records is not being adequately regulated.
What are we scared of?
That many are freaked out about medical records being handed to Google (a company known for its global surveillance of search engine users) is understandable, but is it rational?
The most common concern is that Google might find enough wiggle room in the contract terms with the NHS to use the public data for other, more sinister purposes. The initial NHS data-sharing agreement left Google’s use of the data largely unconstrained, but the revised contract was more stringent.
“The question is, of course, whether we trust Google to stick to these policies,” Mr Boiten said.
The auditing by the NHS might be sufficient to prevent more blatant abuses by Google, he said, but UK privacy laws were fairly lax about the sharing of anonymous health data.
DeepMind might find ways to generate data that was somewhat personal, but not personal enough for crude data-protection laws to protect it, Mr Boiten said. This could be problematic from an ethical standpoint if patients had not consented to their data being used by a third party, as well as presenting a risk of individual privacy breaches.
However, the EU’s sweeping new General Data Protection Regulation, which takes effect in 2018, including in post-Brexit Britain, will severely curtail the capacity of businesses to ignore privacy considerations.
Also of concern is Google’s apparent lack of self-regulation around its use of personal data, particularly in AI. As part of its acquisition of DeepMind three years ago, Google promised to set up an ethics board. To date, DeepMind has refused to disclose the board’s membership or any details of its discussions. The Medical Republic asked DeepMind about this mysterious ethics board but did not receive a response.
The “Partnership on Artificial Intelligence to Benefit People and Society” between Google, Facebook, Amazon, IBM, Apple and Microsoft, announced last year, also seems, so far, to be little more than window dressing. The stated aims of the partnership are to improve public understanding and set standards for researchers, but, if its website is anything to go by, the group has not made any noise since May this year.
But an even greater worry than privacy breaches may be the ability of the digital monopolies to lock up the health analytics market, said Ross Anderson, a professor of security engineering at the University of Cambridge in the UK.
Just as pharmaceutical monopolies gamed the system to inflate the cost of drugs, digital giants given too much control over health data might bend the system in their favour, he said.
Some of the tactics used by pharmaceutical companies to force governments to pay higher prices for drugs included “organising groups of sufferers to holler in the press until ministers give in” and “building up dominant positions in particular diseases or neighbourhoods”, Professor Anderson said.
“We’re starting to see the emergence of a novel type of market power, namely information. The IT industry is rife with monopolies (IBM, then Microsoft, now Google and Facebook) and the ways in which they rig markets are simply beyond the experience of health officials, who have a difficult enough time dealing with drug companies.”
Regulators might be able to prevent excesses, but the actions of the UK Information Commissioner’s Office were not promising, Professor Anderson said. The Commissioner found that it was against the law for the Royal Free Hospital to have given more than a million records to Google. “But was Google fined, or required to destroy the records? No. Only the hospital was censured. That does not bode well for the future,” he said.
“If health regulators aren’t willing to stand up to powerful companies with lots of clout, then taxpayer-funded health systems will be increasingly screwed over; and if slices of health regulation are left in the hands of privacy regulators who are deliberately given few powers and led by non-combative political appointees, that is worrying.”
It was difficult for the media and government to hold tech giants to account when their operations were so secretive, Mr Tanner said.
Tech giants knew that the health-data market was wide open at the moment, and that the company to get there first would be the “winner that takes all”, he said.
In this latest “space race”, these companies scrupulously guarded their secrets to protect competitive interests, which was a significant barrier to public scrutiny. This paranoia often became exaggerated, Mr Tanner said, particularly in the case of Facebook, which required visitors to sign a non-disclosure agreement before entering the building, a precaution not demanded even by high-security institutions such as the White House and the Kremlin.
Finally, Google also has a poor track record when it comes to anti-competitive behaviour. Just last month, the company was fined €2.42 billion by the European Commission for illegally gaining an advantage over its shopping rivals. A recent investigation by The Wall Street Journal found that Google was paying professors who wrote papers declaring, among other things, that the company was not misusing its market power to crush competitors.
There’s no such thing as anonymous
The Australian government is similarly possessed by the desire to put patient data in the hands of researchers who can make the best use of it. In 2015, Prime Minister Malcolm Turnbull issued a Public Data Policy Statement, which pledged to share public data with the private and research sectors. “Publishing, linking and sharing data can create opportunities that neither government nor business can currently envisage,” Mr Turnbull said.
This glorious vision experienced some major hiccups, however. Shortly after the government posted a billion lines of apparently de-identified, linked MBS/PBS data online last year, cyber-security specialists figured out how to decrypt the service provider ID numbers.
They did not expose the patient ID numbers, but the government quickly took the data offline regardless, and launched an independent audit. “This dataset will only be restored when concerns about its potential vulnerabilities are resolved,” the Department of Health said.
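Whatever the specifics of the scheme the government used, the generic weakness is easy to demonstrate: if a small, structured ID space is obscured with a deterministic, unsalted function, an attacker can pre-compute the encoding of every possible ID and simply reverse the mapping. The Python sketch below is a toy demonstration of that idea; the hash and ID format are invented for illustration and are not those in the MBS/PBS release.

```python
import hashlib

# Toy demonstration only: the encoding below is invented, not the scheme
# actually used in the MBS/PBS release.
def weak_encode(provider_id: int) -> str:
    """A deterministic, unsalted 'de-identification' of a six-digit ID."""
    return hashlib.md5(str(provider_id).encode()).hexdigest()

# The published dataset would contain only the encoded values...
published = {weak_encode(123456), weak_encode(654321)}

# ...but six-digit IDs allow only a million possibilities, so an attacker
# can pre-compute them all and invert the mapping in seconds.
lookup = {weak_encode(pid): pid for pid in range(1_000_000)}
recovered = sorted(lookup[code] for code in published)
print(recovered)  # [123456, 654321]
```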
The policy of trying to release personal health data was “overly optimistic” and “probably ill-founded in the first place”, said Dr Vanessa Teague, a researcher at the University of Melbourne who alerted the government to the security issue, along with her colleague Dr Chris Culnane.
“If there had been more tech people involved, there may have been better protections,” she said.
The Medical Republic asked the Department of Health whether it was now consulting with cyber-security experts to ensure that publicly available data was properly de-identified, but did not receive a response.
In an attempt to crack down on re-identification, the federal Attorney-General, George Brandis, proposed an amendment to the Privacy Act in September last year that would make it a criminal offence to re-identify public data. But this went against the common practice of encouraging experts to test data security and alert groups to areas of vulnerability, Dr Teague said.
With large volumes of health data, the question was often not whether anonymous data points could be tracked back to individuals, but how soon, Mr Tanner said.
With enough patience, any cryptographer could connect the disparate bits of information to an individual, and the rest of the identities would unravel from there. “Data sets of ‘anonymous’ data are fast becoming identifiable,” Australian Privacy Commissioner Timothy Pilgrim said. “Personal information is not just that which does identify you, but that which may.”
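The mechanics of that unravelling are easy to show. In the toy example below, built entirely on invented data, an “anonymous” medical dataset is joined to a public register on three quasi-identifiers (postcode, birth year and sex), which is enough to pin sensitive records to named individuals.

```python
# Toy linkage attack on entirely invented data: a handful of quasi-identifiers
# is often enough to join an "anonymous" dataset to a public one.
anonymous_records = [
    {"postcode": "2000", "birth_year": 1957, "sex": "F", "diagnosis": "HIV"},
    {"postcode": "3121", "birth_year": 1984, "sex": "M", "diagnosis": "overdose"},
]

public_register = [  # e.g. an electoral roll or a scraped profile dump
    {"name": "Alice Smith", "postcode": "2000", "birth_year": 1957, "sex": "F"},
    {"name": "Bob Jones", "postcode": "3121", "birth_year": 1984, "sex": "M"},
]

QUASI_IDENTIFIERS = ("postcode", "birth_year", "sex")

def link(record: dict, register: list) -> list:
    """Return the names of everyone in the register matching the record."""
    return [person["name"] for person in register
            if all(person[k] == record[k] for k in QUASI_IDENTIFIERS)]

for record in anonymous_records:
    print(link(record, public_register), "->", record["diagnosis"])
# ['Alice Smith'] -> HIV
# ['Bob Jones'] -> overdose
```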
The inherent difficulty in keeping personal data anonymous meant that public health data shouldn’t be handed out to just anyone, Dr Teague argued. Instead, the government should create a secure research environment offline where de-identified data could be accessed by trusted research institutions for specific purposes.
If, in rare circumstances, data were to be placed in the hands of commercial entities, patients would need to give explicit permission, Dr Culnane said.
But the lines have now been blurred between research and business, with companies like Google volunteering to help government with analytics and app development. As DeepMind has demonstrated, these partnerships can produce exciting innovations. But they can also raise alarm in the context of limited regulation.
Big data is a genie that has just started to come out of the bottle. In the right hands, it can do enormous good. It’s wise to be suspicious of corporate motivations but, to quote the author Ernest Hemingway, “The way to make people trustworthy is to trust them”.