Santa Cruz Tech Beat

Robust Debate: Privacy for the Era of Big Data

By Jan Janes
Special to Santa Cruz Tech Beat

(Photo above: Michael Troncoso, UCSC general counsel, moderates the DataLex Symposium morning panel, Privacy and Genomics (l-r) Melissa Bianchi, Atul Butte, Roslyn Martorano, Barbara Evans, Olena Morozova and Rachel Nosowsky. Credit: Jan Janes)

October 16, 2015 — Santa Cruz, CA

Privacy, Big Data and the Law

UCSC sponsored its first DataLex Symposium Tuesday, Oct. 13, attracting scientists, attorneys and technologists to a robust debate about how privacy concerns and individual rights collide with technology innovation and healthcare research in the growing arena of Big Data.

Chair of the event and UCSC chief campus counsel Michael Troncoso welcomed the guests, shared an expansive agenda for the day, invited their participation and thought leadership, and introduced opening speaker Jim Dempsey, Executive Director of the Berkeley Center for Law & Technology.

Solutions to the Privacy Conundrum

Dempsey agreed up front: the potential benefits of Big Data, throughout the UC system and beyond, are a given.

Consumers expect privacy, confidentiality and some level of control. In daily life, data is routinely and often ubiquitously collected by a third party. How is information used in a fair way?

Dempsey outlined traditional FIPPs (Fair Information Practice Principles) to include:

  • Notice and awareness
  • Choice and consent
  • Access and participation
  • Integrity and security
  • Enforcement and redress

FIPPs form the alphabet soup of current privacy laws at the local, state and federal levels, embodied in HIPAA, GLBA, FCRA, RFPA, FERPA, the CPNI rules, COPPA, ECPA, the Privacy Act, the Cable Act, VPPA and DPPA, and their implementation is very complex.

Jim Dempsey, Executive Director of Berkeley Center for Law and Technology, delivered the opening address to the UCSC DataLex Symposium Oct. 13. (Credit: Jan Janes)

Funneling to a simpler matrix, Dempsey discussed the importance of transparency, purpose and respect for context. He suggested that, as researchers, “The one that trips us up the most is use limitation, when data collected for one purpose is used for another purpose, without the express consent of the subject.”

Big Data is all about unanticipated uses of data, about taking data intended for one purpose and using it in other ways. FIPPs adherence becomes impossible in this new context.

For technologists who think this only applies to researchers, think again. Current regulations include the requirement of information security: Anyone who collects the data is obligated to protect the data.

“There’s been a lot of talk about redefining privacy for the era of Big Data.”

Dempsey suggested a different model to achieve fairness and respond to new demands created by Big Data, and he asked the audience to temporarily set aside standard practices. Instead, build a hypothetical study that involves people, but use a different set of questions:

  • How will you use the data?
  • What rights will individuals have?
  • Will there be tertiary use of the data?
  • Will the data be stored centrally?
  • Will individuals be identifiable or re-identifiable?
  • Will you use the data for other purposes?
  • Will you share the data with others?
  • How will the data be protected?

In closing, Dempsey cited Kuner et al. in acknowledging “issues such as accessibility, accuracy and reliability may matter … more than privacy.”

Privacy Expectations and Genomic Data Research Collide

Using the opening address as a frame, Troncoso introduced a distinguished panel of speakers to explore the question: Do strong privacy protections thwart research and harm the advance of precision medicine?

“Wow, I’m surrounded by lawyers, I must have done something wrong,” said Atul Butte, MD, PhD, Director of the Institute for Computational Health Sciences and professor of pediatrics at University of California San Francisco, eliciting laughter from the audience.

Pointing to the extensive genomics research done at UCSC and throughout the UC system, Butte said it is evident society has entered a genomics revolution. He shared the different genomic tests that can be performed, their relative costs, and the sometimes narrow thinking about medical definitions, including cancer. We are learning, Butte said, “No two cancers are going to be the same.”

“We have changed in how we are doing a lot of the research,” he said. “The basis it comes from is that people are dissatisfied with how medicine is practiced. Maybe we’re not doing the right thing.”

“We see the world differently, and many of those differences lie in our DNA,” said panelist Olena Morozova, a post-doctoral scholar at UC Santa Cruz Genomics Institute leading the Treehouse Childhood Cancer Project.

“You can imagine that the cancers we get are also going to be different, because they are based on our DNA.”

Morozova argued that every cancer is a rare disease, because no two individuals are the same. She described the case of a Treehouse patient with an unusual tumor: the team scanned the existing database for information and prescribed therapy based on what it found.

“We are very willing to donate our blood, donate our bone marrow, organs even,” she said. “To cure everybody’s cancer, we need everybody’s data.”

So, why not donate our data?

In support of the human urge to share, Butte presented a different profile of the public’s willingness to forgo privacy, noting the recent milestone set on Facebook: one billion people used the social media platform during a single day, August 24, to share personal information, photographs and other data.

Panelist Rachel Nosowsky, with the UC Office of General Counsel, circled back to the opening remarks and Dempsey’s fairness frame of privacy. “What we have missed in the FIPPs conversation is the idea that there are other values besides privacy for its own sake.”

Instead of the simple question of privacy vs. no privacy, she suggested it is actually a question of privacy vs. health. Posed as a simplistic dichotomy, the question precludes understanding a deeper connection: research and public health unite in a ‘learning healthcare system’ that researchers are just now beginning to talk about. She said the health system can advance only if practitioners continually examine and evaluate its adequacy.

Explaining the false protection of the privacy umbrella, she said, “When people opt out, or don’t opt in, it creates bias” because there is not enough information. In the case of the Treehouse cancer patient highlighted by Morozova, the available database held 10,000 cancer profiles.

Nosowsky posed a Big Data shift. “What if we had 10 million profiles, with an even greater chance of targeted treatment options?”

Having something vulnerable to protect

Following reports of data breaches affecting insurance provider Anthem, Ashley Madison and Sony, Butte wondered whether people see themselves as warriors protecting their privacy.

Panelist Barbara Evans, George Butler Research Professor and Director of Center for Biotechnology & Law at the University of Houston Law Center, proposed a different profile, noting people are fearful something in their genetic data will create vulnerability or potential discrimination.

Citing existing law that protects access to healthcare or prevents employment termination based on genetic background, she took issue with “the notion that we should withhold our data to fight discrimination. We should go after the problem, not have people hiding their data.”

“I think we’re ultimately as crazy as the number of secrets we keep,” Evans said. She suggested, not as policy, but as hypothetical argument: In the face of an enormous health data breach, people might discover others who have the same embarrassing problems.

What is the right balance between privacy and fair disclosure?

According to Roslyn Martorano, systemwide privacy manager for the University of California, you can have an abstract discussion, or you can discuss the need for privacy on a case-by-case, contextual basis. In her experience, people in research want to know exactly what they can and cannot do. “The problem is that innovation is outpacing our ability to have those discussions.”

She also discussed the concerns about when, not if, there are data breaches in the future, as well as problems that occur when researchers require that people take certain actions, or make decisions for them. “We try to get informed consent, but patients don’t understand what we mean. They see all that legal language, they immediately shut down. They say ‘I’m not comfortable with this.’”

When participants opt out of sharing personal medical data to protect their privacy, they shrink the information pool accessible by researchers. Early screening and diagnosis for major diseases, such as breast cancer, offer targeted treatment plans and avoid a one-size-fits-all approach. In the world of analytics, robust data sets help predict who might get different diseases. Opt-outs add bias into studies. Should patients get to say, ‘I’m opting out’ and deny everyone else information?
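To illustrate the kind of bias at issue (an illustration added here, not presented at the symposium): a minimal Python sketch, assuming a made-up condition rate and assuming that willingness to share depends on whether someone has the condition, shows how an estimate computed only from shared records drifts away from the true rate.

```python
import random

# Illustrative simulation (not from the symposium): when willingness to share
# is correlated with health status, estimates from the shared data are biased.
random.seed(42)

N = 100_000
TRUE_RATE = 0.10        # assumed true prevalence of some condition
P_SHARE_SICK = 0.80     # assumed: the sickest are the most likely to share
P_SHARE_HEALTHY = 0.40  # assumed: healthier people opt out more often

shared_total = shared_cases = 0
for _ in range(N):
    sick = random.random() < TRUE_RATE
    shares = random.random() < (P_SHARE_SICK if sick else P_SHARE_HEALTHY)
    if shares:
        shared_total += 1
        shared_cases += sick

print(f"true prevalence:           {TRUE_RATE:.3f}")
print(f"estimate from shared data: {shared_cases / shared_total:.3f}")  # ~0.18
```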

According to Butte, FIPPs is about fairness. Is it fair for data generators? Is it fair for data users? Is it fair for those who need the data? “In my experience, the sickest of the patients are the most likely to share.”

Social rights are cultural, and privacy is a social right

Evans, describing long held cultural beliefs, said many people feel very protective of their health data. No matter how valuable the information might be to others, they still want to protect their dignity and have control over their data.

However, she said while “we have a long legal tradition that we don’t touch people’s bodies without their consent, ‘touching my data’ is not the same as ‘touching me.’”

An additional concern expressed by patients is a lack of control about how their data will be used. They won’t know what will become of their information at a later date. In sharing research data, the barrier to entry for families who are already battling a known disease is lower, and the perceived return is greater.

A little history, a little bit of law

Two global experts, Melissa Bianchi, partner at Hogan Lovells, who leads the firm’s digital health initiative, and Rachel Nosowsky, discussed the history and ramifications of the Federal Common Rule, adopted in 1991 by more than a dozen federal agencies. It regulates research standards for all biomedical and human subjects studies receiving federal funding. It also provides the guidelines for gaining the appropriate consent needed to do precision research.

The Common Rule has not changed much since adoption. As a framework for ethical research, it minimizes risks to participants, informs them that their participation is voluntary, and requires that research be reviewed by independent ethics groups to assess the risks and benefits to individual participants.

According to Nosowsky, it is an example of old style regulation. “It is pretty high level, open to a great deal of interpretation, which creates both opportunities and challenges for institutions and for participants.” It is applicable in the United States and reaches overseas. Some of the major changes being proposed include exempting more categories considered low risk.

Bianchi spoke about an advance notice of proposed rulemaking that originated in 2011. According to her, “It has taken this long for the agency to come back and try again. Researchers were not at the table and they did not comment as a community when HIPAA regulations were promulgated.”

The people writing the regulations did not hear from the researchers most impacted by suggested rule changes.

Do rule changes favor private companies over federally funded research?

According to Bianchi, “The biggest proposed change is that [human] biospecimens, even with no identifying data attached, will be considered human subjects research. HIPAA currently has guidelines whereby if specific identifiers are stripped from data, a biospecimen can be de-identified and research can continue.

“This will change the game for all the biospecimen research out there…and change the way tissue can be collected, studied and researched,” said Bianchi.

In response, Evans noted this new proposal creates a “bioethic, biospecimen exceptionalism, by saying biospecimens are intrinsically identifiable. Many other things are intrinsically identifiable. If researchers have your gait from your mobile wearable device, that is uniquely identifying to you.”

“The type of science needed will relate genomic data to very rich, longitudinal data. Any rich parametric data set is intrinsically identifiable,” said Evans.
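As a rough illustration of the de-identification approach Bianchi describes and the limits Evans points to: a minimal, hypothetical Python sketch of stripping direct identifiers from a record. The field names are invented for illustration; real HIPAA Safe Harbor de-identification covers 18 categories of identifiers, and, as Evans notes, rich genomic or longitudinal data can remain identifiable even after such fields are removed.

```python
# Hypothetical sketch of stripping direct identifiers from a record, in the
# spirit of HIPAA's Safe Harbor approach. Field names are invented; the real
# rule lists 18 identifier categories, and removing fields alone does not
# guarantee the remaining data cannot be re-identified.
DIRECT_IDENTIFIERS = {
    "name", "address", "phone", "email", "ssn",
    "medical_record_number", "birth_date",
}

def strip_identifiers(record: dict) -> dict:
    """Return a copy of the record with direct identifier fields removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

sample = {
    "name": "Jane Doe",
    "birth_date": "1972-04-01",
    "tumor_type": "neuroblastoma",
    "rna_seq_file": "sample_0421.bam",
}
print(strip_identifiers(sample))  # only tumor_type and rna_seq_file remain
```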

Further discussing the proposed rules change, Nosowsky expressed concern that the original balance would be lost, along with a real opportunity for ethics boards to waive informed consent.

“The impact may be the privatization of genomic data sets.”

She went on to describe the continued loss of state and federal research funding. Large research universities will still be held to a strict set of standards. By contrast, private companies not reliant on federal funding can pursue research without the burden of Common Rule, IRB and HIPAA data regulations.

Bianchi said, “It will push that research to private, commercial enterprise, because they won’t be subject to these rules.”

Should consumers be skeptical about commercial research privacy risks?

Butte suggested many reasons to have precision medicine open to commercial delivery mechanisms: to get better, faster research; to stop using FIPPs as a protection for ineffective, cumbersome medical facilities; and to champion the little guy and focus on rare conditions the big institutions don’t want to bother researching.

Morozova, project lead of Treehouse Childhood Cancer Project, suggested giving the power back to the participants, asking them what they want, providing full disclosure about how their data will be used and letting them know of the risks. “Maybe the data will end up in the public space,” she said, “but maybe that’s okay because it’s going to save somebody’s life.” She also suggested data protection and encryption practices to minimize the risks.

Concurring with her, Evans shared philosophy from a recent speech and said people expect their data to be used in ways that will benefit the public and create good. She dispelled the concept that noncommercial uses of data are good and commercial uses are bad. “Everything universities do is commercial,” she said. “We need to get back to basics and quit using money as a proxy for what’s good and what’s bad.”

From the audience, one person raised the possibility that expanded data sharing could lead to negative outcomes. What about cloning, or other civilization changers? Do we protect against those outcomes by not risking data exposure, and as a result not finding solutions to our problems?

As a further audience query, what of people who want to do good, but lack the sense of public trust? Is the system really rigged toward commercial interests that will have access to the data? How do we create databases that are part of a public trust, an asset belonging to the public?

The Blue Button Download

According to Butte, the VA hospital system reports 500,000 active participants using the ‘blue button’ download, and insurance company Aetna reports 36 million participants who have downloaded their health information. Is their data safe and protected, or have they copied it to personal yet insecure computers, loaded with spyware?

He expanded on the idea that much data is stored insecurely, but shared his view that democratizing the data will help more people get the solutions they need, which is, in itself, civilization changing.

Bianchi, agreeing that research data ought to be protected, said the largest risk she finds in her legal practice is amazing, brilliant researchers who struggle to keep control of their laptops, on which so much of their data is maintained. “People who have one amazing skill set don’t necessarily have the other,” she said, suggesting encryption and other protections to heighten security.
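In the spirit of Bianchi’s suggestion, a minimal sketch of what encrypting a results file at rest could look like, here using the Python `cryptography` package’s Fernet recipe. The file name and contents are invented, and a real deployment would also rely on full-disk encryption and institutional key management rather than a key generated next to the data.

```python
# Illustrative sketch (file name and contents invented): encrypting a local
# results file at rest with the `cryptography` package's Fernet recipe.
from cryptography.fernet import Fernet

# Stand-in for a researcher's local results file.
with open("results.csv", "w") as f:
    f.write("sample_id,tumor_type\n0421,neuroblastoma\n")

key = Fernet.generate_key()  # in practice, keep the key in a key manager,
fernet = Fernet(key)         # not alongside the data it protects

with open("results.csv", "rb") as f:
    ciphertext = fernet.encrypt(f.read())
with open("results.csv.enc", "wb") as f:
    f.write(ciphertext)

# An authorized user holding the key can recover the plaintext later.
print(fernet.decrypt(ciphertext).decode())
```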

Creating a public trust

Addressing the repeated concerns about data protection and breaches, another audience member suggested that if you can’t distinguish who you trust from who you don’t, you have no trust and no security. Within circles of trust, systems can be secured with a high degree of confidence.

To conclude the session and address the issues of privacy and security, Butte cited the Anthem data breach, noting the company still attracted a 2% increase in customers. They did it by demonstrating transparency.

“It’s our responsibility to earn a similar trust, if we are transparent and show proper, safe, effective reuse of data. It’s not about the money, it’s about the public good,” Butte said.

###

Jan Janes, Communications Director, Web Publisher, Television & Radio Producer, Photojournalist, Social Media Architect, can be reached on LinkedIn.

###
