Why this doctor posted his medical history online for anyone to see

Would you want your friends and family to know your entire medical history?

How about sharing your personal health information with your employer, potential employers, insurance companies – anyone at all?

One doctor, John D. Halamka, MD, has posted his entire medical history online, and he wants others to do the same.

His wife, daughter, mother, father and father-in-law also volunteered to make their medical records publicly available for demonstrations of electronic health records and health information exchanges.

In a biographical article published on Politico.com, Halamka explains his decision and rationale for sharing his medical history and his genome as part of the Personal Genome Project.

Dr. Halamka isn’t a run-of-the-mill family physician – he’s a professor, chief information officer (CIO) and dean of technology at Harvard Medical School, the CIO at Beth Israel Deaconess Medical Center in Boston, and a practicing emergency physician.

He’s also chairman of the New England Health Electronic Data Interchange Network (NEHEN), CEO of a regional health information organization, and chair of the US Healthcare Information Technology Standards Panel (HITSP).

Clearly, this is a highly educated, credentialed and technologically savvy person – so why is Halamka, in his words, voluntarily giving up his privacy?

Halamka says it’s for science – specifically, the data in his genome and medical records can be used for clinical trials and medical research.

It’s not like Halamka is putting his Social Security number out there for anyone to steal his identity (as the CEO of LifeLock did, with predictable results). That wouldn’t be wise.

Cybercriminals have breached millions of records at hospitals and major health insurers like Anthem in recent years – not because they’re looking for embarrassing medical facts about patients, but because they want personally identifiable information like Social Security numbers, names and addresses to commit identity theft and fraud.

But the information Halamka is giving away is highly personal, even if it’s not particularly embarrassing.

I’m no doctor, but from his medical history it’s clear that Halamka is a very healthy 53-year-old: he’s a vegan who doesn’t smoke or drink alcohol; he exercises at least 10 hours per week; he has no unresolved medical conditions; and his family history shows no risk of cancer, diabetes or hypertension.

Halamka readily admits that this kind of openness is not for everyone – nor should it be.

To quote Halamka:

I am very sensitive to the fact that everyone has different privacy preferences. Data in the medical record could include sexually transmitted diseases, HIV status, domestic violence, substance abuse, and mental health issues that individuals do not want to share openly. I absolutely respect that.

He says sharing of health records should be “patient-driven,” and calls for legal protections for those who share their information publicly.

There are laws prohibiting medical discrimination by employers and insurers in the US – such as the Genetic Information Nondiscrimination Act and the Patient Protection and Affordable Care Act (“Obamacare”).

But existing laws might not be enough to protect people against discrimination – computer programs that slice and dice our data can learn things about us, including our health, even if we don’t publish our medical records online.

You’ve heard about “Big Data,” and how it can increase efficiencies in a host of ways. It can certainly be used to improve medicine and the quality of healthcare.

But the vast quantities of data collected about us can also be used in ways we might never know about, understand, or approve of.

Data about our activity on Facebook can be fed to algorithms that predict personal traits – such as substance abuse or sexual orientation – more accurately than even our friends and family can.
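To give a rough sense of how that kind of inference works, here’s a minimal sketch on purely synthetic data – not the researchers’ actual models, and no real Facebook data – showing how a simple classifier trained on binary “like” signals can learn to guess a trait the user never disclosed:

```python
# Minimal, illustrative sketch: predict an undisclosed trait from binary
# "page like" features. All data here is synthetic and made up for the demo.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

n_users, n_pages = 5000, 200
# Each row is one user's binary vector of page likes.
likes = rng.integers(0, 2, size=(n_users, n_pages))

# Pretend a hidden trait correlates with a handful of pages.
signal_pages = rng.choice(n_pages, size=10, replace=False)
logits = likes[:, signal_pages].sum(axis=1) - 5
trait = (rng.random(n_users) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Predicting a trait the user never disclosed: AUC = {auc:.2f}")
```

On this toy data the model scores well above chance, which is the broader point: the signal sits in behaviour we never thought of as health data.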

Algorithms also tend to discriminate against minorities, women, and other marginalized groups, such as the poor – a kind of “data redlining.”

For instance, researchers recently discovered that men are far more likely than women to be shown ads on Google that are related to high-paying jobs.

And because machine-learning algorithms evolve based on what people do online, they tend to reinforce people’s prejudices – as an example, web searches involving names more closely identified with black people are more likely to turn up ads with “arrest” in them than searches for white-identified names.
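To see how that feedback loop can play out, here is a toy simulation – my own illustration, not how Google or any real ad platform actually works – of an ad server that learns which of two ads gets more clicks for a given search term and then mostly shows the winner:

```python
# Toy epsilon-greedy ad server: mostly show the ad with the best observed
# click-through rate, occasionally explore. Purely illustrative.
import random

random.seed(42)

ADS = ["neutral_ad", "arrest_record_ad"]

def serve(true_ctr, rounds=50_000, epsilon=0.05):
    clicks = {ad: 0 for ad in ADS}
    impressions = {ad: 1 for ad in ADS}   # start at 1 to avoid dividing by zero
    shown = {ad: 0 for ad in ADS}
    for _ in range(rounds):
        if random.random() < epsilon:
            ad = random.choice(ADS)       # occasional exploration
        else:                             # otherwise exploit the best observed rate
            ad = max(ADS, key=lambda a: clicks[a] / impressions[a])
        shown[ad] += 1
        impressions[ad] += 1
        clicks[ad] += random.random() < true_ctr[ad]   # True counts as 1
    return shown

# Suppose users click the "arrest" ad somewhat more often for one group of
# names (12% vs 8%). The learned policy ends up showing it the vast majority
# of the time, turning a modest behavioural skew into a large exposure skew.
print(serve({"neutral_ad": 0.08, "arrest_record_ad": 0.12}))
```

The specific numbers don’t matter; the mechanism does. A system that optimises purely for clicks will faithfully learn, and then magnify, whatever bias is already in the clicks.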

As a Big Data review ordered last year by President Obama found, this kind of data-driven discrimination can have profound effects:

... outcomes like these, by serving up different kinds of information to different groups, have the potential to cause real harm to individuals, whether they are pursuing a job, purchasing a home, or simply searching for information.

Halamka argues that, because his medical records are out in the open for anyone to see, he doesn’t have to worry if hackers get them.

And that’s his choice.

Yet in our data-driven world, technology providers and policymakers should put proper protections in place to ensure that openness or privacy about our health remains a genuine choice for everyone.


Image of physician on a tightrope courtesy of Shutterstock.com