
Study applies game theory to genomic privacy

January 16, 2017

It comes down to privacy -- biomedical research can't proceed without human genomic data sharing, and genomic data sharing can't proceed without some reasonable level of assurance that de-identified data from patients and other research participants will stay de-identified after they're released for research.

Data use agreements that carry penalties for attempted re-identification of participants may be a deterrent, but they're hardly a guarantee of privacy. Genomic data can be partially suppressed as they're released, addressing vulnerabilities and rendering individual records unrecognizable, but suppression quickly spoils a data set's scientific usefulness.

A new study from Vanderbilt University presents an unorthodox approach to re-identification risk, showing how optimal trade-offs between risk and scientific utility can be struck as genomic data are released for research.

The study appears in the American Journal of Human Genetics.

Doctoral candidate Zhiyu Wan, Bradley Malin, Ph.D., and colleagues draw on game theory to simulate the behavior of would-be data privacy adversaries, and show how marrying data use agreements with a more sensitive, scalpel-like data suppression policy can provide greater discretion and control as data are released. Their framework can be used to suppress just enough genomic data to persuade would-be snoops that their best privacy attacks will be unprofitable.

"Experts in the privacy field are prone to assume the worse case scenario, an attacker with unlimited capability and no aversion to financial losses. But that may not happen in the real world, so you would tend to overestimate the risk and not share anything," Wan said. "We developed an approach that gives a better estimate of the risk."

Malin agrees that failure to come to grips with real-world risk scenarios could stifle genomic data sharing.

"Historically, people have argued that it's too difficult to represent privacy adversaries. But the game theoretic perspective says you really just have to represent all the ways people can interact with each other around the release of data, and if you can do that, then you're going to see the solution. You're doing a simulation of what happens in the real world, and the question just becomes whether you've represented the rules of the game correctly," said Malin, associate professor of Biomedical Informatics, Biostatistics and Computer Science.

To date, no one has faced prosecution for attacking the privacy of de-identified genomic data. Privacy experts nevertheless assume a contest of computerized algorithms as de-identified data are released, with privacy algorithms patrolling the ramparts while nefarious re-identification algorithms try to scale them.

Re-identification attacks have occurred, but according to earlier research by Malin and colleagues, the perpetrators appear to be motivated by curiosity and academic advancement rather than by criminal self-interest. They're sitting at computers just down the hall, so to speak, overpowering your data set's de-identification measures, then publishing an academic paper saying just how they did it. It's all very bloodless and polite.

The new study is something different, more tough-minded, situating data sharing and privacy algorithms in the real world, where people go to jail or are fined for violations. Here the envisaged privacy adversary doesn't wear elbow patches, lacks government backing and is simply out to make a buck through the illicit sale of private information.

De-identified genotype records are linked to de-identified medical, biometric and demographic information. In what the study refers to as "the game," the attacker is assumed already to have some named genotype data in hand, and will attempt to match this identified data to de-identified genotype records as study data are released.

To bring these prospective attackers out of the shadows, the authors present a detailed case study involving release of genotype data from some 8,000 patients. They painstakingly assign illicit economic rewards for the criminal re-identification of research data. Based on costs for generating data, they also assign economic value to the scientific utility of study data.

On the way to estimating risk and the attacker's costs, the authors estimate the likelihood that any named individual genotype record already held by the attacker is included in the de-identified data set slated for release; according to the authors, this key estimate is often neglected in re-identification risk assessments.
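
To make that calculation concrete, the following is a minimal sketch, not the study's actual model, of how an attacker's expected profit might be weighed once that inclusion probability is factored in; every function name, probability and dollar figure below is a hypothetical illustration.

```python
# Illustrative sketch of the attacker's side of the "game".
# All parameters and dollar values here are hypothetical, not the
# values used in the Vanderbilt case study.

def attacker_expected_payoff(p_in_dataset, p_correct_match, reward,
                             attack_cost, p_sanctioned=0.0, penalty=0.0):
    """Expected profit from trying to re-identify one named target.

    p_in_dataset:    chance the target's record is in the released data at all
    p_correct_match: chance the attack links the target to the right record
    reward:          illicit value of a successful re-identification
    attack_cost:     cost of mounting the attack
    p_sanctioned, penalty: expected data use agreement sanction, if any
    """
    expected_gain = p_in_dataset * p_correct_match * reward
    expected_cost = attack_cost + p_sanctioned * penalty
    return expected_gain - expected_cost


# A rational attacker in this toy model only attacks when the payoff is positive,
# so the sharer needs to suppress just enough data (lowering p_correct_match)
# or attach enough deterrence (raising p_sanctioned * penalty) to push it below zero.
payoff = attacker_expected_payoff(p_in_dataset=0.02, p_correct_match=0.6,
                                  reward=150.0, attack_cost=4.0,
                                  p_sanctioned=0.05, penalty=100.0)
print(f"expected payoff per target: {payoff:.2f}")  # negative: attack not worth it
```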

The authors measure the utility of a study's genomic data in terms of the frequencies of genetic variants: for a given variant, the greater the difference between its frequency in the study group and its frequency in the general population (based on available reference data), the greater its scientific utility. This approach to utility triumphed recently when Wan and Malin won the 2016 iDASH Healthcare Privacy Protection Challenge. Their winning algorithm proved best at preserving the scientific utility of a genomic data set while thwarting a privacy attack.
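
As a rough sketch of that utility measure, with invented allele counts and a deliberately simple scoring rule that may differ from the paper's exact weighting, each variant can be scored by the gap between its frequency in the released cohort and its frequency in a reference panel:

```python
# Minimal sketch of a frequency-difference utility score.
# Allele counts and the reference panel are invented for illustration.

def allele_frequency(alt_alleles, total_alleles):
    return alt_alleles / total_alleles

def variant_utility(study_freq, reference_freq):
    # The bigger the gap between the study cohort and the reference
    # population, the more scientific signal the variant is taken to carry.
    return abs(study_freq - reference_freq)

STUDY_ALLELES = 2 * 8_000      # diploid cohort of roughly 8,000 patients
REF_ALLELES = 2 * 100_000      # hypothetical reference population

variants = {
    # variant id: (alt alleles observed in study, alt alleles in reference)
    "rs0000001": (1_200, 9_000),
    "rs0000002": (4_500, 52_000),
    "rs0000003": (300, 12_000),
}

for vid, (study_alt, ref_alt) in variants.items():
    score = variant_utility(allele_frequency(study_alt, STUDY_ALLELES),
                            allele_frequency(ref_alt, REF_ALLELES))
    print(f"{vid}: utility {score:.3f}")

# Suppressing a variant zeroes out its utility but also removes one attribute an
# attacker could use for matching; that trade-off is what the game has to balance.
```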

For any genomic data set, before any data are released in a game's opening move, the sharer can use the game to compare various data sharing policies in terms of risk and utility. In the case study, the game theoretic policy provides the best payoff to the sharer, vastly outperforming a conventional data suppression policy and edging out a data use agreement policy.
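
A toy version of that comparison might look like the sketch below, where the sharer's payoff is reduced to utility retained minus expected privacy loss; the fractions, probabilities and values are hypothetical, chosen only so the ordering mirrors the one reported in the case study.

```python
# Toy comparison of data sharing policies from the sharer's point of view.
# All numbers are hypothetical illustrations, not the study's parameters.

def sharer_payoff(utility_released, p_profitable_attack, loss_if_attacked):
    return utility_released - p_profitable_attack * loss_if_attacked

TOTAL_UTILITY = 100.0    # value assigned to releasing the full data set
LOSS_PER_ATTACK = 60.0   # expected cost to the sharer of a successful attack

policies = {
    # policy: (fraction of utility retained, chance an attack is still profitable)
    "release everything, no protections": (1.00, 0.90),
    "data use agreement only":            (1.00, 0.10),
    "blanket suppression":                (0.40, 0.05),
    "game-theoretic, targeted release":   (0.95, 0.00),
}

for name, (kept, p_attack) in policies.items():
    payoff = sharer_payoff(TOTAL_UTILITY * kept, p_attack, LOSS_PER_ATTACK)
    print(f"{name:36s} payoff = {payoff:6.1f}")
```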

Across a wide range of settings for the illicit financial rewards and for the information an attacker is likely to wield, the authors show that the game theoretic approach generally provides the best payoff to the sharer. They sketch how their approach could serve the release of data from other sources, including the federal government's upcoming Precision Medicine Initiative.
-end-

Vanderbilt authors joining Wan and Malin in the study include Yevgeniy Vorobeychik, Ph.D., Weiyi Xia, Ph.D., and Ellen Wright Clayton, M.D., J.D. They are joined by Murat Kantarcioglu, Ph.D., of the University of Texas at Dallas.

The study was supported by grants from the National Institutes of Health (HG008701, HG009034, HG006844 and LM009989).

Vanderbilt University Medical Center

