By Susannah Fox, Jane Sarasohn-Kahn and Lisa Suennen
I’ve lived long enough to have learned
The closer you get to the fire the more you get burned
But that won’t happen to us
Cause it’s always been a matter of trust
– “A Matter of Trust,” by Billy Joel
If you’re in health care and you don’t live under a rock, you have probably heard that UnitedHealth Group (UHG) has acquired PatientsLikeMe (PLM). After the announcement, there was a lot of sound and fury, some of which signified nothing, as the saying goes, and some of which signified a lot.
Three good friends – Susannah Fox, Jane Sarasohn-Kahn and Lisa Suennen – got to talking about this and realized we had so much to say we just had to write it down in one giant melting pot of prose – a trifecta of thoughts about this transaction but, more globally, about the entire burgeoning phenomenon of data as a business and patients as… People? An asset to be sold or bartered? A sum of their data? We hope it’s the former, but we also worry it’s the latter two at times.
Get a cup of coffee and sit back, because this is a long one – not the usual short and sweet entry that each of us endeavors to craft on our respective blogs.
First, some of the backstory: PLM was way ahead of its time, recognizing the power of peer-to-peer connections and the importance of data as currency long before many others in health care figured that out. Founded in 2004 by Ben and Jamie Heywood (with a friend, Jeff Cole) as a forum for sharing patient information about amyotrophic lateral sclerosis (ALS), or Lou Gehrig’s disease, after their brother Stephen was diagnosed with it, the platform grew to attract nearly 700,000 patients talking and sharing wisdom with each other about a multiplicity of health care conditions. They also started DigitalMe, a research program in which patients shared actual biologic samples with PLM to make for more comprehensive data sets.
PLM shared data with academic researchers to allow scientific advancement to flourish in partnership with patients, who could suggest research questions based on their own experiences. And PLM did business deals with large pharmaceutical companies who wanted the data to advance their research and development efforts – that’s how the PLM business was primarily funded. Patients knew this – PLM was emphatic about transparency in its community and the company never hid what they were doing with people’s data. PLM and its community are well aware that the data collected by traditional clinical trials and medical records is but a fraction of what an individual knows about themselves and their experience of life and illness. PLM built a vessel into which people could pour their experiences, their biomarkers, their life hacks. Not only that, they directed the streams of data to researchers who were standing by, catching every drop, like phlebotomists filling vial after vial. This data would not go to waste and patients had a forum to act on their own desires to carefully document and share their experiences for the benefit of themselves and others. For more color, read Jane’s April 2008 report, “The Wisdom of Patients,” where she wrote about PLM in its start-up phase (see section on page 20 entitled “An Ongoing, Live Outcomes Study”).
PLM built custom software, stored a staggering amount of data, and secured access to it, all while making constant improvements to the user experience. And they invested in people because the communities they built require a human touch. But they were never able to reach scale and lacked a viable, longer-term business model.
To address that challenge, in 2017 PLM secured a $100 million (controlling interest) investment from iCarbonX, a Chinese company that is amassing patient data to discover cures for disease. When this occurred, it didn’t cause more than a ripple of interest outside biotech. But the U.S. federal government’s oversight wheels began to grind into a different gear, signaling that they did not like that a Chinese firm was gaining such access to U.S. citizens’ personal data. Following a review by the Committee on Foreign Investment in the United States (CFIUS), the government mandated that iCarbonX divest its ownership in PLM. We three don’t know what process ensued and what bidders came to the table, but in the end, it was UHG that stepped into iCarbonX’s shoes to become the new owner.
This story creates many interesting avenues for debate (e.g., is it bad for a company based in China to have potential access to Americans’ personal data, even if PLM’s plan was to have the U.S. data remain with a separate entity with a Board made up of Americans? Would companies based in other countries be “better” than one based in China? Was this kind of action the real intent of CFIUS? Is CFIUS stifling legitimate investment and innovation, or not? etc.). But what is interesting in this particular story’s context is that few said much about it in either social media or the trade press when the original iCarbonX deal was done. There was discussion, but nowhere near the response created by the UHG/PLM announcement.
In some ways the UHG/PLM deal is a bit of a Rorschach test for those intersecting with the health care field, be they providers, payers, researchers, or patients; one’s view of this deal is intrinsically intertwined with the pre-existing perceptions of the observer.
If you see the U.S. health care system as fundamentally broken and inhumane, you may see this deal as either another step toward the abyss or a bridge across a chasm.
If you see the collection of mass amounts of data into a big vat, sprinkled with AI, as a way to cure the health care industry’s ills and produce better outcomes, you may see this as a promising step in the right direction.
If you know the founders and history of PLM, you are likely to see this as an unexpected, but not radical, turn in their upward climb toward transparent and collaborative discovery of treatments for life-changing conditions. In fact, if you knew that the business model of PLM was always to package and sell data to researchers, including the pharmaceutical industry, you may shrug your shoulders and assume there is no fundamental change. But if you were not aware of that fact, you may be worried about who, exactly, now has access to the data.
If you see PLM as a suite of supportive patient communities, a few notches above Facebook Groups, you may mourn the loss of what you perceived as an independent forum for patients to share information about their health care challenges.
If you are alarmed by the incessant stories about cybersecurity breaches, particularly those that have affected insurance companies, you may see this deal as a significant new danger for patients and their data.
But that is a whole different vector than the “good vs. evil” discussion that is going on in the context of insurers, Big Pharma and patients. The debate around this transaction is so complicated and so fraught that we wanted to try to boil it down to the core issues – to dis-aggregate the Rorschach images, if you will.
Some very fundamental questions are at stake: the right of privacy, the right to agency over one’s own body (and, by extension, the data related to it), the right to be free from discrimination, among others. We are living in a time when many of these rights, thought to be “settled law” in the United States, are becoming anything but that in multiple contexts. This puts a particularly fine point on the importance of understanding what’s driving the reaction.
And it has become increasingly clear that people/consumers/patients are generally interested in sharing information that would help others and/or help themselves. A 2018 national survey found that 39% of U.S. teens and young adults say they have gone online to try to find people with health conditions similar to their own, and 61% say they have read, listened to, or watched other people share about their health experiences online. Some people care about the profit motives that may attach to the use of the data that drives discovery, and some don’t. Some want to include all their stories, both biologic and psychologic. Some don’t want to share feelings or community, but they do want the facts so they can make decisions. Whatever. In the end it boils down to this: people want to decide their own fates, want a right to help others or not help others, want a sense of control over those aspects of their lives which are most personal, and want to exercise their right against self-incrimination (such as in the risk of discrimination by malevolent use of your own data against you), even while sharing information. These interests, like many other battles being fought around the concepts contained in the Bill of Rights these days, can be complicated.
As Jane, Lisa and Susannah further dug into this discussion, what emerged was a clear set of vectors into which all of the issues fall. These are: Perception, Trust, Risk and the Uncertainty Produced by an Ever-Changing Context, categories that sweep in all of the issues at hand. So, let’s dig into each of those a bit, shall we?
Perception: We are living in a moment in time when pharmaceutical companies, health insurance companies, and hospital systems have been demonized in the public discourse. The usual reasons are related to withholding of necessary care, excessive cost, surprise patient billing, opioids, and numerous other factors that underpin a lot of rational negative responses. So, it’s not that surprising that some people might look at the UHG/PLM transaction through their already negative-colored glasses. “Oh no! Give people’s data to a big insurer? How could you?”
Of course, few organizations already have more of your data than those very same companies and, thank goodness, they are also there to help us get access to care and get it paid for. Organizations like UHG are much more than “just” insurance companies at this point: UHG has an active clinical research organization working to improve the outcomes of treatment; they have a significant tech/data business; and just around the time the PLM acquisition went public, UHG’s acquisition of DaVita Medical Group, the primary care physician group, was completed. It is impossible to paint them with a one-color brush. Public perception has not caught up to the reality that the lines between health care companies are so blurred at this point that it is becoming nearly impossible to tell the insurers from the providers from the pharma/medtech manufacturers from the retailers from the tech and data analytics companies. Lisa gave a talk about this last year, accessible here.
We are also living in a time when people are very confused about how to view big tech companies like Amazon, Apple, Google and Facebook – four of the so-called “FAANG” companies. Sometimes they are our best friends, supporting our daily life-flows, social supports and entertainment; sometimes, not so much. They give great consumer experience – few argue with that. Some are better than others at data privacy and/or transparency. For some people, it doesn’t matter at all – they simply don’t care that much about data privacy. Others perceive the relentless pursuit of data as predatory. Where you sit is where you stand.
Regardless of where you sit or stand, we have to acknowledge that the companies are also the creators of jobs filled by actual people who are also committed to doing good, not just well. Sure, there are exceptions to that, but there are exceptions to that at EVERY organization. One can probably find examples of and exceptions to that at your local gas station. In the end, we have to figure out not just who is technologically capable of being a good steward of people’s data, but who is morally responsible and financially responsible for this task. We also need to separate out whether it even matters if such a steward is a for-profit or a not-for-profit entity – there seems to be some perception out there in the world that not-for-profit entities are, by definition, “better” or more “ethical”. That is not so obvious to us. What will really matter is the true and demonstrated commitment to transparency, openness and legitimate and safe citizen science. People and organizations are becoming increasingly aware that their data is a powerful and valuable asset so the rules of engagement will matter greatly as these perceptions evolve.
Trust: Who should be in one’s circle of trust? How should we, as people, calculate that? People trusted PLM and that’s why they freely gave their information and enabled PLM to share it. One of the biggest expressed concerns about this particular transaction is that the patient-organization trust relationship could change. This is, in part, because people in the U.S. have a lived experience of not getting insurance coverage when they needed or wanted it. People understand the potential power insurance companies have over their lives and fear that companies’ access to more data could make that power grow in ways that could be used against them.
Interestingly, people don’t ascribe that same power to tech companies, who have so much data on us from so many sources beyond medical claims that it makes insurers look entirely benign. Yet the trust relationship with technology companies, even after the Cambridge Analytica scandal, is different and it’s clearly stronger than it is with health insurers. Should it be?
One thing we heard is that PLM engaged its patient Board of Advisors in the discussion and design of this transaction and that their interests were well-represented in the deal process, ensuring that the trust relationship was safe and that the transparency that they had become accustomed to would continue. That’s a great story to hear, and it’s noteworthy that UHG was listening as well. It’s this type of action that will be essential to increasing and underscoring the patient-trust relationships as more and more data deals are announced every day among all sorts of health care system players.
Risk: What’s worse: the risk of dying from a disease that could have been treated if more data was available, or the risk of your data being seen/used by others to do discovery? That’s literally the trade-off patients are making when they open up their bodies and share what’s in them. It’s true that many aren’t quite as eyes-wide-open about that, but many are. And for some, the risk of their data “being out there” is far better than the risk that no one is able to help them or their children fend off what God or biology has wrought. Maslow’s Hierarchy of Needs is right on target here – if one is facing death or grave disability, the other risks fade to the back.
Fear about an Ever-Changing Context: When PLM first formed back in 2004, a lot of the data sharing issues we speak about hadn’t even been conceived of yet. The Internet cloud was not yet in the sky; data as a service was a business model barely in diapers.
Today, not only have the business models evolved, but the technology and the legal environment are changing so fast that we can’t keep up with our own decisions. Imagine if, in 2004, you shared data with PLM (or any organization) with an expectation that it would always be shared in a de-identified way. Now fast forward 15 years and you might wake up to find that the power of computing and the integration of so many sources of data (credit card purchase data, retail spending, online search, genomic, phenotypic, and on and on) could allow your data to be “reaggregated” in a manner that re-identifies your health data after the fact. Interestingly, a news story came out last week about this very subject in the context of a lawsuit featuring Google and the University of Chicago. How would you, as a patient, feel if your freely given de-identified data was suddenly identifiable by those with whom you did not even know you were sharing it? Probably not so great.
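To make the re-identification risk concrete, here is a minimal Python sketch of the classic “linkage attack”: joining a supposedly de-identified data set against an identified one on shared quasi-identifiers such as ZIP code, birth year, and sex. All of the data, field names, and the `reidentify` helper below are hypothetical, invented purely for illustration; real attacks use far richer data, but the mechanics are the same.

```python
# Hypothetical illustration of a linkage attack on "de-identified" data.

# A "de-identified" health data set: no names, just quasi-identifiers
# (ZIP code, birth year, sex) plus a sensitive attribute.
deidentified_health_records = [
    {"zip": "02138", "birth_year": 1954, "sex": "F", "condition": "ALS"},
    {"zip": "02139", "birth_year": 1987, "sex": "M", "condition": "diabetes"},
]

# A separate, identified data set (think: a public voter roll or a
# purchased marketing list) that shares the same quasi-identifiers.
public_roll = [
    {"name": "Alice Smith", "zip": "02138", "birth_year": 1954, "sex": "F"},
    {"name": "Bob Jones", "zip": "02139", "birth_year": 1987, "sex": "M"},
]

def reidentify(health_records, roll):
    """Join the two data sets on (zip, birth_year, sex) quasi-identifiers."""
    # Index the identified data set by its quasi-identifier tuple.
    index = {(p["zip"], p["birth_year"], p["sex"]): p["name"] for p in roll}
    matches = []
    for record in health_records:
        key = (record["zip"], record["birth_year"], record["sex"])
        if key in index:
            # A match links a name to a sensitive health condition.
            matches.append((index[key], record["condition"]))
    return matches

print(reidentify(deidentified_health_records, public_roll))
```

When the quasi-identifier combination is unique in the population (as it often is), every matched record pairs a real name with a sensitive condition, even though the health data set contained no names at all. That is the scenario the “reaggregation” worry describes.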
On the other hand, the advent of high-power artificial intelligence, also largely absent in health care back in 2004, now makes your dreams about finding a cure actually possible, provided that the amount and breadth of data available can flow freely to those with the best engines for discovery. How would you, as a patient, feel? Probably optimistic.
In the end, many people with serious health conditions want to know more about themselves and they sincerely want to help advance science. They do not want to accidentally step in harm’s way because technology keeps changing. It is quite a conundrum.
There could be a similar discussion around how the players in the health care system are evolving. Would you have thought about your health care record’s stewardship for a second when you were at your health care provider 15 years ago, hastily signing a HIPAA form? Not really. You go, the doctor writes stuff down with a pen in your file, end of story. Now all records are being digitized, and thus are far more shareable. Additionally, many provider organizations are now merged with payer/insurance organizations and may even be integrated with retail companies or tech companies. Who knew? Not you, but your data cat is already out of the bag.
This might not even matter to people so much if their real fear wasn’t about how their data could be used against them. We currently have laws on the books in the U.S. that prevent insurers from denying you coverage if you have a pre-existing condition. But could that change? Hell, yes it could. It’s a matter of political debate every single day in the U.S. We currently have laws on the books that say you can’t be discriminated against in health insurance coverage or employment based on your genetic profile (i.e., GINA). Could that change? You bet. All it takes is a Congress with a bad day and anything can happen. And none of these laws apply in the context of life insurance. Could they apply in the context of, say, abortion as a crime? Could someone in a state where abortion is outlawed look at your data, see that you had one, and arrest you? This seemed extreme just one year ago, but in this current political/social moment in several U.S. states, one never knows.
In an increasingly data-rich world, will there come a day when Internet-based access to the sequencing of your microbiome will cause restaurants not to serve you certain foods for liability reasons? Where does it end? People can’t even comprehend, much less anticipate, the crazy ways we could use people’s data against them if we are being sinister, much as they can’t comprehend all of the good that could come from the free flow of data touched by great researchers and advanced research tools. The possibilities are endless in both directions.
Now, add in the cybersecurity risks we never anticipated and your head could just explode. Providers, payers, retailers…they have all been hacked. What do you do when your personal data is available to anyone with an entrée to the black market and a credit card? How do we think about that in terms of perception, trust and risk?
There is a huge, untapped reservoir of energy in our health care system fueled by patients and the people who love them. Most people want to contribute to the betterment of society, help their fellow humans, and find a way to a healthier life. They are willing to share a piece of themselves to do that. Most organizations who are the natural recipients of health data also have good intentions and want to engage in data-based business transactions that advance medicine for the good of their balance sheets and the good of society. So, the trick is, as always, how to balance doing good with self-protection? How do we balance the advancement of science with the management of risk to our own interests? How do we allow for the former without unintentionally causing harm to ourselves? How do we live out our responsibility to be good corporate and civic citizens when the risk of doing so can, potentially, have significant negative repercussions? How do we recognize and account for the fact that the seismic shift in what health care companies are doing in expanded business models must change how we engage with them and perceive them? How do we and they ensure that the businesses with whom we engage are part of the circle of trust and deservedly so?
These are the Big Issues that we, as an industry, need to stay focused upon, recognizing that the wild cards of market evolution that we cannot foresee are always out there. In the end, we cannot advance science without patients’ involvement and we can’t advance health without collaboration and community. We sincerely hope that the UHG/PLM integration provides an excellent blueprint for doing both.
About the Authors:
Susannah Fox advises companies and organizations on how to navigate the intersection of health and technology. She boosts the signal for peer-to-peer health advice every chance she gets.
Jane Sarasohn-Kahn is a health economist focused on the patient’s challenges as payer and consumer – with the promise of morphing into a health citizen. Jane writes the Health Populi blog. She is the author of the recently-published book, HealthConsuming: From Health Consumer to Health Citizen.
Lisa Suennen is Managing Director at Manatt, Phelps & Phillips. She works with exceptional leaders seeking to build value in a transforming health care world. As a strategy consultant and venture capital investor, Lisa focuses on the worlds of innovation, business model transformation and where health care intersects with technology. She is also the author of the Venture Valkyrie blog and co-host of the Tech Tonics Podcast.
Disclosures: All three of the authors provide strategic advice to health care companies and organizations (including payers, providers, retailers, tech companies, emerging companies and others), including some of the companies and organizations mentioned in this article. For details on our specific affiliations, see our bios below:
Susannah Fox Bio and her Blog
Jane Sarasohn-Kahn Bio and her Blog, Health Populi
Lisa Suennen Bio and her Blog, Venture Valkyrie
Kerstin Leuther says
Thanks for the excellent and thoughtful analysis!
Lisa Suennen says
Thanks Kerstin! L
Matthew Holt says
This reminds me of the edit to the entry for the Earth in The Hitchhiker’s Guide to the Galaxy. It starts out as “Harmless” but after 5 years of scrutiny gets changed to “Mostly Harmless.”
My initial reaction to the PLM/UNH deal was “complicated.” Having read this article by 3/5 of the Ladies who Dine on the topic, my reaction is now “extremely complicated.”
Lisa Suennen says
Matthew – for once we agree 🙂
Ramin Bastani says
An important article at an important time.
I’ve watched/admired PLM since getting into the health + tech space a long time ago… and I hope they are able to continue being a force for good inside of UnitedHealth Group.
PS – My favorite sentence…
“In the end it boils down to this: people want to decide their own fates, want a right to help others or not help others, want a sense of control over those aspects of their lives which are most personal, and want to exercise their right against self-incrimination (such as in the risk of discrimination by malevolent use of your own data against you), even while sharing information.”
Lisa Suennen says
Thank you for another great post, this time on trust. I’ve been a big fan of Susannah and first time reader of Jane’s writing. Nice collaboration!
A very timely post, in part because of the recent launch of a cryptocurrency initiative by one of the largest social media platforms (Facebook), where “a matter of trust” has become an international challenge.
As with all companies attempting to establish and retain trust, looking at their histories is often not reassuring. And even for those with very little history, such as the blockchain ecosystem companies, their behavior is not reassuring.
Add to that the varying degrees of anonymization intrinsic to their models, and Reagan’s warning “Trust but Verify” becomes another place where trust breaks down.
I recently lost my best friend to pancreatic cancer. This has caused me to think long and hard about the promise of patient-provided medical and personal health records.
These mega-acquisitions make explicit or implicit claims that descriptive information from Big Data will lead to advances in predictive and prescriptive applications. That’s only true if the needed information is in the data.
Please see Slide 6, here, for a concrete illustration.
“Donate your data to science” will improve our collective predictive information in areas where we currently have a poverty of high-quality descriptive data. With regard to curing pancreatic cancer and other prescriptive applications, we need to be more explicit about how likely the information content is to help us find the cure, even with more “comprehensive data sets,” as referenced in your post.
Failure to do so is untrustworthy; it is dishonest, and that kind of overselling hurts patients, families, institutions, industries and science. There are some very fundamental and knowable things about pancreatic cancers that, even with “staggering amounts of existing data,” we will never know. We could and should. I haven’t heard that as a marketing message for any of these companies (I have carefully read and listened to many pitches).