Supreme Court’s decision on abortion sparks health tech’s Cambridge Analytica moment

In 2018, the tech industry found itself in a harsh spotlight amid a scandal involving a company called Cambridge Analytica, which had collected and used the data of millions of Facebook users, seemingly without their consent. The scandal prompted a public outcry, congressional hearings, and a $5 billion fine, and it permanently altered the discourse around how social media firms use data.

In the wake of a Supreme Court decision overturning Roe v. Wade, health data privacy is getting its own Cambridge Analytica moment.

With viral calls to delete period-tracking apps and fears that health records could be used to prosecute people seeking abortions in states where the procedure is now illegal, multiple experts invoked Cambridge Analytica as a reference point for the scrutiny health data practices could rightfully draw. They expressed hope that health care companies would examine their practices, but they also said that lasting change will require federal policy changes and the teeth to enforce them.

“We’ve seen Facebook do better in terms of its privacy policies and in terms of the clarity and in terms of individuals’ ability to dictate what data gets shared and what doesn’t,” said Megan Ranney, a professor at Brown University and director of the Brown-Lifespan Center for Digital Health. “And I think it may be time for that type of reckoning for us in the health sphere. I would say it’s overdue, but maybe now is the moment.”

The 26-year-old Health Insurance Portability and Accountability Act, or HIPAA, generally governs how health care organizations can and can't use patient data, and how they must secure it. But experts say HIPAA has glaring weaknesses in the modern world of medical data. Kristen Rosati, an attorney and former president of the American Health Law Association, noted that the law doesn't protect sensitive data from being disclosed to law enforcement, nor does it cover the countless health devices, apps, or online searches through which people unwittingly disclose intimate health information that can then be shared, sold, or aggregated.

“It is a really chaotic environment,” said Lisa Bari, CEO of Civitas Networks for Health. “And the lack of a national data privacy protection law is hurting everything. It’s hurting people’s health. It’s hurting people’s privacy. It’s making it hard to exchange data for permitted purposes.”

Consumer data is currently regulated by a hodgepodge of state laws as well as the Federal Trade Commission, which can act against deceptive or unfair data practices. A bipartisan bill currently working its way through Congress would provide some consumer-empowering rules for data collected by health apps. The Department of Health and Human Services’ Office for Civil Rights also oversees the use of health data covered by HIPAA, and has already issued guidance on when reproductive health data must be disclosed to law enforcement.

The wild west of health data has led to numerous smaller-scale scandals over the years: Google’s effort to build up its health care business with millions of patient records from Ascension drew criticism. Flo, maker of a period-tracking app that is once again in the spotlight, settled with the Federal Trade Commission last year over allegations that it misled users about the privacy of their data. And just weeks ago, an investigation revealed that major hospital systems were sending health information to Facebook, which experts said could be a violation of HIPAA.

But health data privacy has never been tied quite so closely, or so publicly, to an event of such widespread consequence.

It’s essential that companies that collect data or build health care products carefully consider the potential consequences, said Christine Lemke, co-CEO of Evidation Health, which helps companies perform consumer health research. She said there are analogs in health care to the way Facebook’s algorithms prioritized incendiary, often misleading content during the last national elections.

“It’s going to lead to unintended secondary consequences. Somebody using the data from a pregnancy app to put a woman in jail,” she said. Today the issue is reproductive health — but in the future, information seeping out from health apps or services could be used in other discriminatory ways. “We ought to be thoughtful about these things.”

Andrea Downing, president and co-founder of the Light Collective, put it more bluntly. She said companies collecting data ought to “think immediately about how this can be weaponized against somebody. How can this be used in the hands of the most evil person doing the worst things with it? … And then when we design something, whether it’s a study, whether it’s technology, we have to be in a place where we protect from those harms from the get-go.”

That companies don’t more often think through how their data or products might be used could be attributed to naiveté, negligence, or business motives. For some companies in the space, the bulk of their business comes from buying and selling data to pharmaceutical or adtech companies for marketing purposes.

In the absence of a change in federal policy, experts said there are immediate steps organizations could take to improve privacy practices — for some, it’s as simple as doing the bare minimum of ensuring the developers they work with don’t unintentionally send private health information to third parties.

Even companies that aren’t covered by HIPAA can voluntarily submit to its rules, Ranney suggested. Rosati said consumer apps could give people a way to immediately delete their data, an option some state-level privacy laws already require and one likely to be included in any national effort. Apps that send data to third parties could carefully consider whether that’s necessary, and closely vet any partners they do share data with. Some services could keep all data on users’ devices, or hide it behind end-to-end encryption that users control. Some sources suggested that voluntary privacy certifications from a trusted nonprofit, or guidelines from the Food and Drug Administration, could help encourage better behavior.
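To make the user-controlled encryption idea concrete, here is a minimal illustrative sketch, not any company’s actual implementation, using Python’s widely used cryptography package. The scenario and the passphrase-derived key are assumptions for illustration: entries are encrypted with a key derived from a passphrase only the user knows, so the plaintext never has to leave the device.

```python
# Hypothetical sketch: a health app that encrypts entries on the device
# with a key derived from the user's passphrase, so its servers only ever
# hold ciphertext. Requires the third-party "cryptography" package
# (pip install cryptography).
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passphrase: str, salt: bytes) -> bytes:
    """Derive a 32-byte Fernet key from a user-held passphrase."""
    kdf = PBKDF2HMAC(
        algorithm=hashes.SHA256(), length=32, salt=salt, iterations=480_000
    )
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))


# The salt is random but not secret; it can be stored alongside the data.
salt = os.urandom(16)
key = derive_key("user-chosen passphrase", salt)
fernet = Fernet(key)

# Encrypt a record on the device before it is ever stored or synced.
token = fernet.encrypt(b'{"date": "2022-06-24", "entry": "cycle day 1"}')

# Only someone holding the passphrase can recover the plaintext.
print(fernet.decrypt(token))
```

Under this kind of design, data obtained from a company’s servers, whether by breach or by subpoena, would be unreadable without the user’s passphrase, which is the property the experts quoted above are pointing at.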

Michelle Dennedy, who has worked on privacy at tech firms for decades and is now building PrivacyCode, a startup that helps companies manage these issues, said that in light of the legal liability data can create, health care organizations ought to “proactively invest in a few projects” to investigate what data they are collecting, how they are using it, and what is really needed for compliance and to drive positive health outcomes.

There’s evidence already that the gears may be turning on voluntary change: Flo announced it would work on an “anonymous mode” that would shield user data.

But while public pressure might encourage some companies to step up, privacy experts said many will simply choose the path of least resistance, which means ignoring privacy until they can’t.

“Whenever it’s a question of their bottom line, they’re not doing the right thing,” said Downing. “We don’t need regulation for good actors. We need regulation for when nobody is looking.”

Even with a national privacy law in the works, experts say there is much to do. Enforcement of existing laws remains light, and HIPAA still needs substantial reforms to reflect the advent of big data, information sharing, and research that the lawmakers who crafted it simply didn’t foresee.

Lucia Savage, the chief privacy and regulatory officer at Omada Health, said that while she’s seen “swelling” consumer interest in privacy since the Cambridge Analytica scandal, such discussions are usually relegated to “Beltway arguing about privacy.” If more people start taking their fears directly to their elected officials, though, it could drive lasting change.

“Congresspeople like nothing better than a constituent telling them a personal story,” she said. “That’s literally the most valuable thing that can happen.”

Source: STAT