Opting Out Is Always Rational

One of the most common memes used in support of mass health data projects is that the data supports important research. Whether it is disease causation, effective treatment, epidemiology, or drug side-effects, researchers need large amounts of data, so your data matters.

But from the perspective of a patient, i.e. you, your data doesn't matter.

Your data would only matter if a study of the whole dataset would produce a different outcome without your participation. But in a dataset covering 47m people (the size of the Hospital Episode Statistics database) or around 53m people (the number of people registered with general practitioners in England, assuming everyone is), the chances of your individual record being anything other than statistical noise are infinitesimal. For that to be the case, you would have to be very unlike the rest of the dataset, but mass population studies rarely identify things that affect only one person. So there will always be sufficient people who look like you to fill your place in the analysis. And of course, the chances of a medical breakthrough hinging on your personal data, _and_ being related to a condition you have, _and_ producing a change in treatment quickly enough to benefit you are similarly small. An infinitesimal chance of a very small benefit has a net present value of zero, for practical purposes.

On the other hand, the risk of the data being leaked, re-identified or otherwise mis-used is greater than zero. We don't know how much greater, and without a code of practice we can't calculate it. But if, for example, the health record in which you discuss your depression with your GP were leaked to your ex-spouse in a contested custody battle, the effect would be immediately harmful. That's a real risk: a small chance times a very large disbenefit has a net present value considerably greater than zero.
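The asymmetry in the two paragraphs above can be sketched as a back-of-the-envelope expected-value calculation. Every number here is an assumption picked purely to show the shape of the argument, not a real estimate of either probability or either payoff:

```python
# Illustrative expected-value sketch of the opt-out argument.
# All figures are assumptions for illustration only; the 47m comes
# from the Hospital Episode Statistics figure quoted in the text,
# everything else is invented.

p_record_matters = 1 / 47_000_000  # chance your record changes a study's outcome
personal_benefit = 1.0             # benefit to you if it does (arbitrary units)

p_damaging_leak = 1 / 10_000       # assumed chance of a harmful leak/re-identification
personal_harm = 100_000.0          # a very large personal disbenefit (same units)

expected_benefit = p_record_matters * personal_benefit
expected_harm = p_damaging_leak * personal_harm

print(f"expected benefit: {expected_benefit:.2e}")
print(f"expected harm:    {expected_harm:.2e}")
```

Under these (invented) numbers the expected harm exceeds the expected benefit by many orders of magnitude, which is the sense in which opting out is individually rational even though the aggregate research benefit may be real.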

Now the problem, of course, is that if everyone thinks like this, there is no data. But everyone won't: Germany's scheme is opt-in, and yet has a reasonable number of participants. Shouting yet more loudly about potential benefits doesn't work, though, because that benefit has already been written down to zero. What needs to happen is calm, rational discussion about why people are over-estimating the potential harm such a project can cause. And without transparent, accountable organisations handling the data, that will never happen.

ian

1 response
It's a classic security economics externality. The Health Service (I use the term in the broadest sense) gets almost all the benefit of care.data, while the individual bears almost all the risk if its security is breached. As Ross Anderson and Tyler Moore have observed, "systems are particularly prone to failure when the person guarding them is not the person who suffers when they fail". You could hardly ask for a clearer example. Reference: Anderson & Moore, "The Economics of Information Security", Science 314, 610 (2006) - http://tinyurl.com/qgb2fw6