25 April 2016
Last month, the Wellcome Trust published a report on public attitudes to commercial access to health data. The report presents findings from a series of workshops with members of the public, including patients and health professionals, as well as from a representative survey of 2017 people across the UK. I would like to highlight three findings that I believe are particularly important: acute awareness, particularly among 'lay' participants, of the power differentials between corporations and citizens; the significance of trustworthy governance and institutions; and the role of public value.
Lay people were particularly concerned with the profound power asymmetries involved in the collection and use of personal data. One participant spoke of a 'one-way mirror' that commercial companies were putting up: 'They know everything about you, but we don’t know what they are doing with that'. When institutional safeguards and procedures were considered trustworthy, respondents were happier for their data to be used in a wider range of contexts. Moreover, the purpose for which their data would be used mattered a lot to most respondents. Research that was seen to benefit patients, societies or future generations received the highest level of support.
The Wellcome Trust's report could not be more timely. Data are hailed as the new currency of our societies, and 'digital' has become the adjective of choice when politicians speak of the economy. Against this backdrop, data governance is more important than ever. Policymakers are recognising this – at the EU level, the new General Data Protection Regulation (GDPR) will give citizens more control over who collects and uses their data, including the right to have personal data deleted in some circumstances.
But this does not go far enough. Some key distinctions that the GDPR – and many other data protection frameworks – uses are no longer suitable for the digital era. Because digital data can be copied, linked, and made accessible to others more easily than ever, the distinction between identified and de-identified data no longer holds; at least in principle, all data can potentially be re-identified. Similarly, as any dataset – even the most innocuous piece of information – can be used in health-related contexts (e.g. when predictive analytics identifies potentially 'costly' patients on the basis of, among other things, their online purchases) any dataset is potentially health-relevant, and any dataset is potentially sensitive.
Commercial companies are not the only type of organisation that could use data to harm people. Even if it were feasible to stop them from using personal data, this would not be a desirable solution: as illustrated in many places in the Wellcome Trust's report, commercial companies can create public value in significant ways. To address the cracks that are appearing in long-held distinctions between public vs. private, identified vs. de-identified, sensitive vs. non-sensitive, and health-related vs. not health-related data, we should look beyond the properties of the data themselves and instead make distinctions based on the context in which data are used.
One way to do this would be to differentiate between data use in the public interest vs. data use not in the public interest. This distinction holds irrespective of who the data user is. For data use in the public interest, we should make it easier to use data, as many organisations are already doing. In addition, we should establish what Alena Buyx and I called Harm Mitigation Funds: independent bodies tasked with deciding on appeals from people who claim they were harmed by data use. They complement, rather than replace, legal systems of compensation. The funds themselves would often come from earmarking a proportion of the sum that a particular organisation or project spends on data use. Harm Mitigation Funds can be established at the level of individual institutions, or they can cover entire regions or nations.
The question of how to determine when data use is in the public interest and when it is not raises its own challenges. Most instances of data use will contain elements of both. But these challenges can, and must, be addressed. The Wellcome Trust's findings clearly indicate that we cannot honour personal autonomy – and consent as one of its manifestations – without ensuring that institutions are trustworthy and fair, and that people will not be left alone when something goes wrong.