Optum spent the last decade pursuing significant growth, adding thousands of physicians to its network and purchasing ASC company Surgical Care Affiliates in 2017. Now the company is focusing more on its primary care network, data offerings and $115 billion pharmacy and medical care business line. Its operations span three divisions:
Optum Health, the healthcare provider division which includes physician groups and ambulatory surgery centers
Optum Insight, which houses the data analytics platforms designed to connect clinical, administrative and financial data
Optum Rx, a pharmacy benefits and care services business
Each division has a unique growth strategy focused on the patient and provider experience.
Provider growth
Optum Health now has 60,000 employed or aligned physicians and partners with 100 payers. Optum as a whole now works with 80 percent of health plans, 90 percent of hospitals and 90 percent of Fortune 100 companies.
Wyatt Decker, CEO of Optum Health, said in the 2021 earnings call in January that Optum Health is still a growth platform and the company will make investments to more deeply penetrate established markets. He expects Optum Health to deliver 8 percent to 10 percent margins annually going forward.
“Our approach strengthens the critical provider-patient relationship by empowering our primary care physicians with the latest information, insights and best practices to help them efficiently coordinate all patient care, manage referrals and identify higher-quality, lower-cost options,” the company said in its 2021 year-end highlights report.
ASCs certainly fit the higher-quality, lower-cost care description, but Optum’s executives stopped short of saying whether the company would focus on growth in that sector during the 2021 earnings call.
Optum is also expanding its virtual care capabilities, focused on chronic care patients, and its behavioral health services. The company said it needs to add physicians, clinicians and technology to support patient care in those areas. Optum said it has its sights set on providing more whole-person, value-based care, scaling in new markets and having the key data insights to do it better than anyone else.
Value-based care
Last year, Optum and UnitedHealth Group’s health insurance business, UnitedHealthcare, worked together with external partners to grow in commercial and government payer markets, innovate and add 500,000 patients to their value-based contracts. Optum served 100 million patients, 2 million of whom were under fully accountable arrangements. Both companies also sharpened their focus on the consumer experience.
“Taken together, these efforts helped us add more than $30 billion in revenue for the year, about $10 billion above our initial outlook,” said Andrew Witty, CEO of UnitedHealth Group, during the earnings call, as transcribed by The Motley Fool. “And you should expect similar growth in the year ahead. We see an even greater demand for integration to bring together the fragmented pieces of the health system, to harness the tremendous innovation occurring in the marketplace, to help better align the incentives for providers, payers and consumers, and to organize the system around value.”
Data and information
Optum Insight aims to continue growing by acquiring Change Healthcare, a healthcare data and technology company.
“The combination will advance our ability to create products and services that improve the delivery of healthcare and reduce the high costs and inefficiencies that plague the health system,” Optum claimed on its fact sheet about the transaction. “We will share these innovations broadly to benefit those who engage with the health system today and well into the future.”
The acquisition could make the episode of care more seamless for patients and reduce administrative burden for providers, as well as give payers a comprehensive view of the patient’s health outcomes with the potential to reduce cost. But it could also give Optum and UnitedHealth Group an unfair advantage over competitors, the Justice Department argued in a lawsuit filed in February.
“Across Optum, we operate with the highest ethical standards in protecting confidential data and information of our clients and adhere to the safeguards we have had in place for more than a decade to ensure data is accessed and used only for permissible purposes,” according to a statement on Optum’s website responding to the Justice Department’s lawsuit. “We will not be distracted by the DOJ’s complaint and will continue to honorably serve our clients and consumers and those that engage in the health system.”
For years, pioneering CFOs steadily extended their duties beyond the boundaries of the traditional finance and accounting function. Over the past year, an expanding set of beyond-finance activities – including those related to environmental, social and governance (ESG) matters; human capital reporting; cybersecurity; and supply chain management – have grown in importance for most finance groups. Traditional finance and accounting responsibilities remain core requirements for CFOs, even as they augment planning, analysis, forecasting and reporting processes to thrive in the cloud-based digital era. Protiviti’s latest global survey of CFOs and finance leaders shows that CFOs are refining their new and growing roles by addressing five key areas:
Accessing new data to drive success – The ability of CFOs and finance groups to address their expanding priorities depends on the quality and completeness of the data they access, secure, govern and use. Even the most powerful, cutting-edge tools will deliver subpar insights without optimal data inputs. In addition, more of the data finance uses to generate forward-looking business insights is sourced from producers outside of the finance group and the organization. Many of these data producers lack expertise in disclosure controls and therefore need guidance from the finance organization.
Developing long-term strategies for protecting and leveraging data – From a data-protection perspective, CFOs are refining their calculations of cyber risk while benchmarking their organization’s data security and privacy spending and allocations. From a data-leveraging perspective, finance chiefs are creating and updating roadmaps for investments in robotic process automation, business intelligence tools, AI applications, other types of advanced automation, and the cloud technology that serves as a foundational enabler for these advanced finance tools. These investments are designed to satisfy the need for real-time finance insights and analysis among a mushrooming set of internal customers.
Applying financial expertise to ESG reporting – CFOs are mobilizing their team’s financial reporting expertise to address unfolding human capital and ESG reporting and disclosure requirements. Leading CFOs are cementing their role in this next-generation data collection activity while ensuring that the organization lays the groundwork to maximize the business value it derives from monitoring, managing and reporting all forms of ESG-related performance metrics.
Elevating and expanding forecasting – Finance groups are overhauling forecasting and planning processes to integrate new data inputs, from new sources, so that the insights the finance organization produces are more real-time in nature and relevant to more finance customers inside and outside the organization. Traditional key performance indicators (KPIs) are being supplemented by key business indicators (KBIs) to provide sharper forecasts and viewpoints. As major new sources of political, social, technological and business volatility arise in an unsteady post-COVID era, forecasting’s value to the organization continues to soar.
Investing in long-term talent strategies – Finance groups are refining their labor model to become more flexible and gain long-term access to cutting-edge skills and innovative thinking in the face of a persistent finance and accounting talent crunch. CFOs also are recalibrating their flexible labor models and helping other parts of the organization develop a similar approach to ensure the entire future organization can skill and scale to operate at the right size and in the right manner.
An argument for humility in the face of pandemic forecasting unknown unknowns.
“Are we battling an unprecedented pandemic or panicking at a computer generated mirage?” I asked at the beginning of the COVID-19 pandemic on March 18, 2020. Back then the Imperial College London epidemiological model’s baseline scenario projected that with no changes in individual behaviors and no public health interventions, more than 80 percent of Americans would eventually be infected with the novel coronavirus and about 2.2 million would die of the disease. This implies that about 0.8 percent of those infected would die of the disease, roughly eight times the mortality rate of seasonal flu outbreaks.
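That implied fatality rate is simple arithmetic, sketched below. The 330 million U.S. population figure and the roughly 0.1 percent seasonal flu fatality rate are my assumptions for illustration, not figures from the model itself:

```python
# Back-of-envelope check of the Imperial College baseline projection.
us_population = 330_000_000    # assumed: approximate 2020 U.S. population
attack_rate = 0.80             # baseline scenario: >80% eventually infected
projected_deaths = 2_200_000   # projected U.S. deaths with no interventions

infected = us_population * attack_rate
ifr = projected_deaths / infected   # infection fatality rate

flu_ifr = 0.001                # assumed: ~0.1% for seasonal flu
print(f"Implied IFR: {ifr:.2%}")                 # about 0.83%
print(f"Relative to flu: {ifr / flu_ifr:.1f}x")  # about 8.3x
```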
Spooked by these dire projections, President Donald Trump issued on March 16 his Coronavirus Guidelines for America that urged Americans to “listen to and follow the directions of STATE AND LOCAL AUTHORITIES.” Among other things, Trump’s guidelines pressed people to “work or engage in schooling FROM HOME whenever possible” and “AVOID SOCIAL GATHERINGS in groups of more than 10 people.” The guidelines exhorted Americans to “AVOID DISCRETIONARY TRAVEL, shopping trips and social visits,” and that “in states with evidence of community transmission, bars, restaurants, food courts, gyms, and other indoor and outdoor venues where people congregate should be closed.”
Let’s take a moment to recognize just how blindly we all—definitely including our public health officials—were flying through the early stages of the pandemic. The guidelines advised people to frequently wash their hands, disinfect surfaces, and avoid touching their faces. Basically, these were the sort of precautions typically recommended for influenza outbreaks. On July 9, 2020, an open letter from 239 researchers begged the World Health Organization and other public health authorities to recognize that COVID-19 was chiefly spread by airborne transmission rather than via droplets deposited on surfaces. The U.S. Centers for Disease Control and Prevention (CDC) didn’t update its guidance on COVID-19 airborne transmission until May 2021. And it turns out that touching surfaces is not a major mode of transmission for COVID-19.
The president’s guidelines also advised, “IF YOU FEEL SICK, stay home. Do not go to work.” This sensible advice, however, missed the fact that a huge proportion of COVID-19 viral transmission occurred from people without symptoms. That is, people who feel fine can still be infected and, unsuspectingly, pass along their virus to others. For example, one January 2021 study estimated that “59% of all transmission came from asymptomatic transmission, comprising 35% from presymptomatic individuals and 24% from individuals who never develop symptoms.”
The Imperial College London’s alarming projections did not go uncontested. A group of researchers led by Stanford University medical professor Jay Bhattacharya believed that COVID-19 infections were much more widespread than the reported cases indicated. If the Imperial College London’s hypothesis were true, Bhattacharya and his fellow researchers argued, that would mean that the mortality rate and projected deaths from the coronavirus would be much lower, making the pandemic much less menacing.
The researchers’ strategy was to blood test people in Santa Clara and Los Angeles Counties in California to see how many had already developed antibodies in response to coronavirus infections. Using those data, they then extrapolated what proportion of county residents had already been exposed to and recovered from the virus.
Bhattacharya and his colleagues preliminarily estimated that between 48,000 and 81,000 people had already been infected in Santa Clara County by early April, which would mean that COVID-19 infections were “50-85-fold more than the number of confirmed cases.” Based on these data, the researchers calculated that toward the end of April “a hundred deaths out of 48,000-81,000 infections corresponds to an infection fatality rate of 0.12-0.2%.” As I optimistically reported at the time, that would imply that COVID-19’s lethality was not much different from that of seasonal influenza.
Bhattacharya and his colleagues conducted a similar antibody survey in Los Angeles County. That study similarly asserted that COVID-19 infections were much more widespread than reported cases. The study estimated 2.8 to 5.6 percent of the residents of Los Angeles County had been infected by early April. That translates to approximately 221,000 to 442,000 adults in the county who have had the infection. “That estimate is 28 to 55 times higher than the 7,994 confirmed cases of COVID-19 reported to the county by the time of the study in early April,” noted the accompanying press release. “The number of COVID-related deaths in the county has now surpassed 600.” These estimates would imply a relatively low infection fatality rate of between 0.14 and 0.27 percent.
Unfortunately, from the vantage point of 14 months later, those hopeful results have not been borne out. Santa Clara County public health officials report that there have been 119,712 diagnosed cases of COVID-19 so far. If infections were really being underreported by 50-fold, that would suggest that roughly 6 million Santa Clara residents would by now have been infected by the coronavirus. The population of the county is just under 2 million. Alternatively, extrapolating a 50-fold undercount would imply that when 40,000 diagnosed cases were reported on July 11, 2020, all 2 million people living in Santa Clara County had been infected by that date.
Los Angeles County reports 1,247,742 diagnosed COVID-19 cases cumulatively. Again, if infections were really being underreported 28-fold, that would imply that roughly 35 million Angelenos out of a population of just over 10 million would have been infected with the virus by now. Again turning the 28-fold estimate on its head, that would imply that all 10 million Angelenos would have been infected when 360,000 cases had been diagnosed on November 21, 2020.
COVID-19 cases are, of course, being undercounted. Data scientist Youyang Gu has been consistently more accurate than many of the other researchers parsing COVID-19 pandemic trends. Gu estimates that over the course of the pandemic, U.S. COVID-19 infections have been roughly 4-fold greater than diagnosed cases. Applying that factor to the number of reported COVID-19 cases would yield estimates of roughly 480,000 and 5,000,000 total infections in Santa Clara and Los Angeles counties, respectively. If those are in the right ballpark, the COVID-19 infection fatality rate would be 0.46 percent in Santa Clara and 0.49 percent in Los Angeles. Both figures are just about in line with the current U.S. infection fatality rate of 0.45 percent, which likewise accounts for a roughly 4-fold undercount of infections.
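The consistency check in the last few paragraphs can be reproduced directly; every figure below is one quoted in the article:

```python
# Test the early serology undercount multipliers against later cumulative case counts.
counties = {
    "Santa Clara": {"population": 2_000_000,  "diagnosed": 119_712,   "multiplier": 50},
    "Los Angeles": {"population": 10_000_000, "diagnosed": 1_247_742, "multiplier": 28},
}

for name, c in counties.items():
    implied = c["diagnosed"] * c["multiplier"]
    # Both early multipliers imply more infections than the county has residents.
    print(f"{name}: implied {implied:,} infections vs. population {c['population']:,}")
    assert implied > c["population"]

# Youyang Gu's roughly 4-fold undercount estimate gives plausible totals instead.
for name, c in counties.items():
    print(f"{name}: ~{c['diagnosed'] * 4:,} estimated total infections")
```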
The upshot is that, so far, we have ended up about halfway between the best-case and worst-case scenarios sketched out at the beginning of the pandemic.
We rarely see the impact of policies reflected in data in real time. The COVID-19 pandemic changed that. In the present moment, a range of government, private, and academic sources catalogue household-level health and economic information to enable rapid policy analysis and response. To continue promoting periodic findings, identifying vulnerable populations, and maintaining a focus on public health, frequent national data collection needs to be improved and expanded permanently.
Knowledge accumulates over time, facilitating new advancements and advocacy. While mRNA biotechnology was not usable decades ago, years of public research helped unlock highly effective COVID-19 vaccines. The same can be true for advancing effective socioeconomic policies. More national, standardized data like the Census Bureau’s Household Pulse Survey will accelerate progress. At the same time, there are significant issues with national data sources. For instance, COVID-19 data reported by the CDC faced notable quality issues and inconsistencies between states.
Policymakers can’t address problems that they don’t know exist. Researchers can’t identify problems and solutions without adequate data. We can better study how policies impact population health and inform legislative action with greater federal funding dedicated to wide-ranging, systematized population surveys.
Broader data collection enables more findings and policy development
Evidence-based research is at the core of effective policy action. Surveillance data indicates what problems families face, who is most affected, and which interventions can best promote health and economic well-being. These collections can inform policy responses by reporting information on the demographics disproportionately affected by socioeconomic disruptions. Race and ethnicity, age, gender, sexual orientation, household composition, and work occupation all provide valuable details on who has been left behind by past and present legislative choices.
Since March 2020, COVID-19 cases and deaths, changes in employment, and food and housing security have been tracked periodically with detailed demographic information through national surveys such as the Census Bureau’s Household Pulse Survey. Both cumulative statistical compilations and representative surveillance polling have been instrumental to analyses. Our team has recorded over 200 state-level policies in the COVID-19 US State Policy (CUSP) database to further research and journalistic investigations. We have learned a number of policy lessons, from the health protections of eviction moratoria to the food security benefits of social insurance expansions. These insights rest on documented evidence.
Without this comprehensive tracking, it would be difficult to determine the number of evictions occurring despite active moratoria, what factors contribute to elevated risk of COVID-19, and the value of states’ pandemic unemployment insurance programs. The wider range of direct and indirect health outcomes measured has bolstered our understanding of the suffering experienced by different demographic groups. These issues are receiving legislative attention, in no small part due to the broad statistical collection and subsequent analytical research on these topics.
Insufficient data results in inadequate understanding of policy issues
The more high-quality data there is, the better. With the state-level policies present in CUSP, our team and other research groups quantified the impact of larger unemployment insurance benefit sizes, greater minimum wages, mask mandates, and eviction freezes. These analyses have been utilized by state and federal officials. None would have been possible without increased data collection.
However, our policy investigations are constrained by the data availability and quality on state and federal government websites, which may be improved with stimulus funds allocated to modernize our public health data infrastructure. Some of the most consequential decision-making right now relates to vaccine distribution and administration, but it is difficult to disaggregate state-level statistics. Many states lack demographic information on vaccine recipients as well as those that have contracted or died from COVID-19. Even though racial disparities are present in COVID-19 cases, hospitalizations, and deaths nationally, we can’t always determine the extent of these inequities locally. These present issues are a microcosm of pre-existing problems.
Data shortcomings present for years, in areas like occupational safety, are finally being spotlighted due to the pandemic. Minimal national and state workplace health data translated to insufficient COVID-19 surveillance in workplace settings. Studies that show essential workers are facing elevated risk of COVID-19 are often limited in scope to individual states or cities, largely due to the lack of usable and accessible data. More investment is needed, during the pandemic and beyond, to better document workplace health and safety. Otherwise, there will continue to be serious blind spots in the ability to evaluate policy decisions, enforce better workplace standards, and hold leaders accountable for their choices.
These are problems with a simple solution: collect more information. Now is not the time to eliminate valuable community surveys and aggregate compilations, but to expand on them. More comprehensive data will shine a spotlight on current and future legislative choices and improve the understanding of policies in new ways. It is our hope that these data collection efforts are built upon and become the new norm.
Disclosure: Funding received from Robert Wood Johnson Foundation was used to develop the COVID-19 US State Policy Database.
It’s not often that the American Hospital Association—known for fun lobbying tricks like hiring consultants to create studies showing the benefits of hospital mergers—directly goes after another consolidation in the industry.
But when the AHA caught wind of UnitedHealth Group subsidiary Optum’s plans, announced in January 2021, to acquire data analytics firm Change Healthcare, they offered up some fiery language in a letter to the Justice Department. “The acquisition … will concentrate an immense volume of competitively sensitive data in the hands of the most powerful health insurance company in the United States, with substantial clinical provider and health insurance assets, and ultimately removes a neutral intermediary.”
If permitted to go through, Optum’s acquisition of Change would fundamentally alter both the health data landscape and the balance of power in American health care. UnitedHealth, the largest health care corporation in the U.S., would have access to all of its competitors’ business secrets. It would be able to self-preference its own doctors. It would be able to discriminate, racially and geographically, against different groups seeking insurance. None of this will improve public health; all of it will improve the profits of Optum and its corporate parent.
Despite the high stakes, Optum has been successful in keeping this acquisition out of the public eye. Part of this PR success is due to the fact that few health care players want to openly oppose an entity as large and powerful as UnitedHealth. But perhaps an even larger part is that few fully understand what this acquisition will mean for doctors, patients, and the health care system at large.
UnitedHealth is the largest health care entity in the U.S. by several metrics. UnitedHealthcare (the insurance arm) is the largest health insurer in the United States, serving over 70 million members through a network of 6,500 hospitals and 1.4 million physicians and other providers. Optum, a separate subsidiary, provides data analytics and infrastructure, a pharmacy benefit manager called OptumRx, a bank providing patient loans called Optum Bank, and more. Through Optum, UnitedHealth also controls more than 50,000 affiliated physicians, the largest collection of physicians in the country.
While UnitedHealth as a whole has earned a reputation for throwing its weight around the industry, Optum has emerged in recent years as UnitedHealth’s aggressive acquisition arm. Acquisitions of entities as varied as DaVita’s dialysis physicians, MedExpress urgent care, and Advisory Board Company’s consultants have already changed the health care landscape. As Optum gobbles up competitors, customers, and suppliers, it has turned into UnitedHealth’s cash cow, bringing in more than 50 percent of the entity’s annual revenue.
On a recent podcast, Chas Roades and Dr. Lisa Bielamowicz of Gist Healthcare described Optum in a way that sounds eerily similar to a single-payer health care system. “If you think about what Optum is assembling, they are pulling together now the nation’s largest employers of docs, owners of one of the country’s largest ambulatory surgery center chains, the nation’s largest operator of urgent care clinics,” said Bielamowicz. With 98 million customers in 2020, OptumHealth, just one branch of Optum’s services, had eyes on roughly 30 percent of the U.S. population. Optum is, Roades noted, “increasingly the thing that ate American health care.”
Optum has not been shy about its desire to eventually assemble all aspects of a single-payer system under its own roof. “The reason it’s been so hard to make health care and the health-care system work better in the United States is because it’s rare to have patients, providers—especially doctors—payers, and data, all brought together under an organization,” OptumHealth CEO Wyatt Decker told Bloomberg. “That’s the rare combination that we offer. That’s truly a differentiator in the marketplace.” The CEO of UnitedHealth, Andrew Witty, has also expressed the corporation’s goal of “wir[ing] together” all of UnitedHealth’s assets.
Controlling Change Healthcare would get UnitedHealth one step closer to creating their private single-payer system. That’s why UnitedHealth is offering up $13 billion, a 41 percent premium on the public valuation of Change. But here’s why that premium may be worth every penny.
Change Healthcare is Optum’s leading competitor in pre-payment claims integrity; functionally, a middleman service that allows insurers to process provider claims (the receipts from each patient visit) and address any mistakes. To clarify what that looks like in practice, imagine a patient goes to an in-network doctor for an appointment. The doctor performs necessary procedures and uses standardized codes to denote each when filing a claim for reimbursement from the patient’s insurance coverage. The insurer then hires a reviewing service—this is where Change comes in—to check these codes for accuracy. If errors are found in the coded claims, such as accidental duplications or more deliberate up-coding (when a doctor intentionally makes a patient seem sicker than they are), Change will flag them, saving the insurer money.
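As a toy illustration of what one kind of pre-payment claims edit does (this is not Change’s actual system; the claim and rule below are hypothetical), a reviewer might flag accidental duplicate procedure codes on a claim before the insurer pays:

```python
from collections import Counter

def flag_duplicate_codes(claim_codes):
    """Return procedure codes that appear more than once on a single claim,
    a simple accidental-duplication check in pre-payment claims review."""
    counts = Counter(claim_codes)
    return sorted(code for code, n in counts.items() if n > 1)

# Hypothetical claim: an office visit billed twice, plus a urinalysis.
claim = ["99213", "99213", "81002"]
print(flag_duplicate_codes(claim))  # ['99213']
```

Real claims editors apply far richer rule sets (up-coding detection, payer-specific rates), which is exactly why access to every client’s rules and rates is so sensitive.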
To accurately review the coded claims, Change’s technicians have access to all of their clients’ coverage information, provider claims data, and the negotiated rates that each insurer pays.
Change also provides other services, including handling the actual payments from insurers to physicians, reimbursing for services rendered. In this role, Change has access to all of the data that flows between physicians and insurers and between pharmacies and insurers—both of which give insurers leverage when negotiating contracts. Insurers often send additional suggestions to Change as well; essentially their commercial secrets on how the insurer is uniquely saving money. Acquiring Change could allow Optum to see all of this.
Change’s scale (and its independence from payers) has been a selling point; just in the last few months of 2020, the corporation signed multiple contracts with the largest payers in the country.
Optum is not an independent entity; as mentioned above, it’s owned by the largest insurer in the U.S. So, when insurers are choosing between the only two claims editors that can perform at scale and in real time, there is a clear incentive to use Change, the independent reviewer, over Optum, a direct competitor.
If regulators allow the acquisition to take place, Optum will suddenly have access to some of the most secret data in health care. In other words, if the acquisition proceeds and Change is owned by UnitedHealth, the largest health care corporation in the U.S. will own the ability to peek into the book of business for every insurer in the country.
Although UnitedHealth and Optum claim to be separate entities with firewalls that safeguard against anti-competitive information sharing, the porosity of the firewall is an open question. As the AHA pointed out in their letter to the DOJ, “[UnitedHealth] has never demonstrated that the firewalls are sufficiently robust to prevent sensitive and strategic information sharing.”
In some cases, this “firewall” would mean asking Optum employees to forget their work for UnitedHealth’s competitors when they turn to work on implementing changes for UnitedHealth. It is unlikely to work. And that is almost certainly Optum’s intention.
The most obvious potential outcome of the merger is that the flow of data will allow Optum/UnitedHealth to preference their own entities and physicians above others. This means that doctors (and someday, perhaps, hospitals) owned by the corporation will get better rates, funded by increased premiums on patients. Optum drugs might seem cheaper, Optum care better covered. Meanwhile, health care costs will continue to rise as UnitedHealth fuels executive salaries and stock buybacks.
UnitedHealth has already been accused of self-preferencing. A large group of anesthesiologists filed suit in two states last week, accusing the company of using perks to steer surgeons into using service providers within its networks.
Beyond this obvious risk, the data alterations caused by the Change acquisition could worsen existing discrimination and medical racism. Prior to the acquisition, Change launched a geo-demographic analytics unit. Now, UnitedHealth will have access to that data, even as it sells insurance to different demographic categories and geographic areas.
Even if UnitedHealth doesn’t purposely use data to discriminate, the corporation has been unable to correct for racially biased data in the past, and there’s no reason to expect it to do so in the future. A study published in 2019 found that Optum used a racially biased algorithm that could have led to undertreating Black patients. This is a problem for all algorithms. As data scientist Cathy O’Neil told 52 Insights, “if you have a historically biased data set and you trained a new algorithm to use that data set, it would just pick up the patterns.” But Optum’s size and centrality in American health care would give any racially biased algorithms an outsized impact. And antitrust lawyer Maurice Stucke noted in an interview that using racially biased data could be financially lucrative. “With this data, you can get people to buy things they wouldn’t otherwise purchase at the highest price they are willing to pay … when there are often fewer options in their community, the poor are often charged a higher price.”
The fragmentation of American health care has kept Big Data from being fully harnessed as it is in other industries, like online commerce. But Optum’s acquisition of Change heralds the end of that status quo and the emergence of a new “Big Tech” of health care. With the Change data, Optum/UnitedHealth will own the data, providers, and the network through which people receive care. It’s not a stretch to see an analogy to Amazon, and how that corporation uses data from its platform to undercut third parties while keeping all its consumers in a panopticon of data.
The next step is up to the Department of Justice, which has jurisdiction over the acquisition (through an informal agreement, the DOJ monitors health insurance and other industries, while the FTC handles hospital mergers, pharmaceuticals, and more). The longer the review takes, the more likely it is that the public starts to realize that, as Dartmouth health policy professor Dr. Elliott Fisher said, “the harms are likely to outweigh the benefits.”
There are signs that the DOJ knows that to approve this acquisition is to approve a new era of vertical integration. In a document filed on March 24, Change informed the SEC that the DOJ had requested more information and extended its initial 30-day review period. But the stakes are high. If the acquisition is approved, we face a future in which UnitedHealth/Optum is undoubtedly “the thing that ate American health care.”
“I don’t think we have good enough information to show how we should be deploying telemedicine,” a physician leader recently told us. “If we can’t show that a virtual visit can adequately substitute for an in-person visit, then we should be focusing on making sure patients know it’s safe to come in.” It struck us that viewing telemedicine as a direct substitute for an office visit was a narrow and antiquated way to think about virtual care.
Moreover, the argument that telemedicine visits are potentially cost-increasing if they are "additive" to other care interactions, rather than "substitutive," is rooted in fee-for-service payment: more patient-provider interactions mean more billable visits, and with more visits we run the risk of increasing costs.
Telemedicine (both video and phone visits) likely taps into pent-up demand for access by patients who would otherwise not seek care. Some patients could be aided by more frequent, brief encounters; this is considered a failure only when viewed through the lens of fee-for-service payment. (Honestly, with primary care accounting for less than 6 percent of total healthcare spending, it’s hard to argue that additional telemedicine visits will be responsible for supercharging the cost of care.) Of course, there are many clinical situations in which in-person interaction—to perform a physical exam, measure vitals, observe a patient—is fundamental. Patients know this, and understand that sometimes they’ll need to be seen in person. But hopefully that next encounter will be more efficient, having already covered the basics.
The ideal care model will look different for different patients, and different kinds of clinical problems—but will likely be a blend of both virtual and in-person interactions, maximizing communication, information-gathering, and patient convenience.
Fourteen of the nation’s largest health systems announced this week that they have joined together to form a new, for-profit data company aimed at aggregating and mining their clinical data. Called Truveta, the company will draw on the de-identified health records of millions of patients from thousands of care sites across 40 states, allowing researchers, physicians, biopharma companies, and others to draw insights aimed at “improving the lives of those they serve.”
Health system participants include the multi-state Catholic systems CommonSpirit Health, Trinity Health, Providence, and Bon Secours Mercy, the for-profit system Tenet Healthcare, and a number of regional systems. The new company will be led by former Microsoft executive Terry Myerson, who has been working on the project since March of last year. As large technology companies like Amazon and Google continue to build out healthcare offerings, and national insurers like UnitedHealth Group and Aetna continue to grow their analytical capabilities based on physician, hospital, and pharmacy encounters, it’s surprising that hospital systems are only now mobilizing in a concerted way to monetize the clinical data they generate.
Like Civica, an earlier health system collaboration around pharmaceutical manufacturing, Truveta’s launch signals that large national and regional systems are waking up to the value of scale they’ve amassed over time, moving beyond pricing leverage to capture other benefits from the size of their clinical operations—and exploring non-merger partnerships to create value from collaboration. There will inevitably be questions about how patient data is used by Truveta and its eventual customers, but we believe the venture holds real promise for harnessing the power of massive clinical datasets to drive improvement in how care is delivered.
Since the beginning of the coronavirus pandemic, Florida has blocked, obscured, delayed, and at times hidden the COVID-19 data used in making big decisions such as reopening schools and businesses.
And with scientists warning Thanksgiving gatherings could cause an explosion of infections, the shortcomings in the state’s viral reporting have yet to be fixed.
While the state has put out an enormous amount of information, some of its actions have raised concerns among researchers that state officials are being less than transparent.
It started even before the pandemic became a daily concern for millions of residents. Nearly 175 patients tested positive for the disease in January and February, evidence the Florida Department of Health collected but never acknowledged or explained. The state fired its nationally praised chief data manager, she says in a whistleblower lawsuit, after she refused to manipulate data to support premature reopening. The state said she was fired for not following orders.
The health department used to publish coronavirus statistics twice a day before changing to once a day, consistently meeting an 11 a.m. daily deadline for releasing new information that scientists, the media and the public could use to follow the pandemic’s latest twists.
But in the past month the department has routinely and inexplicably missed its own deadline, sometimes by as much as six hours. On one day in October, it published no update at all.
News outlets were forced to sue the state before it would publish information identifying the number of infections and deaths at individual nursing homes.
Throughout it all, the state has kept up with the rapidly spreading virus by publishing daily updates of the numbers of cases, deaths and hospitalizations.
“Florida makes a lot of data available that is a lot of use in tracking the pandemic,” University of South Florida epidemiologist Jason Salemi said. “They’re one of the only states, if not the only state, that releases daily case line data (showing age, sex and county for each infected person).”
Dr. Terry Adirim, chairwoman of Florida Atlantic University’s Department of Integrated Biomedical Science, agreed, to a point.
“The good side is they do have daily spreadsheets,” Adirim said. “However, it’s the data that they want to put out.”
The state leaves out crucial information that could help the public better understand who the virus is hurting and where it is spreading, Adirim said.
The department, under state Surgeon General Dr. Scott Rivkees, oversees the county health agencies covering Florida’s 67 counties, such as the one in Palm Beach County headed by Dr. Alina Alonso.
Rivkees was appointed in April 2019. He reports to Gov. Ron DeSantis, a Republican who has supported President Donald Trump’s approach to fighting the coronavirus and pressured local officials to reopen schools and businesses despite a series of spikes indicating rapid spread of the disease.
At several points, the DeSantis administration muzzled local health directors, such as when it told them not to advise school boards on reopening campuses.
DOH Knew Virus Here Since January
The health department’s own coronavirus reports indicated that the pathogen had been infecting Floridians since January, yet health officials never informed the public about it and they did not publicly acknowledge it even after The Palm Beach Post first reported it in May.
In fact, the night before The Post broke the story, the department inexplicably removed from public view the state’s dataset that provided the evidence. Mixed among listings of thousands of cases was evidence that up to 171 people ages 4 to 91 had tested positive for COVID-19 in the months before officials announced in March the disease’s presence in the state.
Were the media reports on the meaning of those 171 cases in error? The state has never said.
No Testing Stats Initially
When positive tests were finally acknowledged in March, all tests had to be confirmed by federal health officials. But Florida health officials refused to even acknowledge how many people in each county had been tested.
State health officials and DeSantis claimed they had to withhold the information to protect patient privacy, but they provided no evidence that stating the number of people tested would reveal personal information.
At the same time, the director of the Hillsborough County branch of the state health department publicly revealed that information to Hillsborough County commissioners.
And during March the state published on a website that wasn’t promoted to the public the ages and genders of those who had been confirmed to be carrying the disease, along with the counties where they claimed residence.
Firing Coronavirus Data Chief
In May, with the media asking about data that revealed the earlier onset of the disease, internal emails show that a department manager ordered the state’s coronavirus data chief to yank the information off the web, even though it had been online for months.
A health department tech supervisor told data manager Rebekah Jones on May 5 to take down the dataset. Jones replied in an email that it was the “wrong call,” but complied, only to be ordered an hour later to put it back.
That day, she emailed reporters and researchers following a listserv she created, saying she had been removed from handling coronavirus data because she refused to manipulate datasets to justify DeSantis’ push to begin reopening businesses and public places.
Two weeks later, the health department fired Jones, who in March had created and maintained Florida’s one-stop coronavirus dashboard, which had been viewed by millions of people, and had been praised nationally, including by White House Coronavirus Task Force Coordinator Deborah Birx.
The dashboard allows viewers to explore the total number of coronavirus cases, deaths, tests and other information statewide and by county and across age groups and genders.
DeSantis claimed on May 21 that Jones wanted to upload bad coronavirus data to the state’s website. In a further attempt to discredit her, he brought up stalking charges made against her by an ex-lover, stemming from a blog post she wrote, which led to two misdemeanor charges.
Using her technical know-how, Jones launched a competing COVID-19 dashboard website, FloridaCOVIDAction.com, in early June. After national media covered Jones’ firing and website launch, people donated more than $200,000 to her through GoFundMe to help pay her bills and maintain the website.
People view her site more than 1 million times a day, she said. The website features the same type of data the state’s dashboard displays, but also includes information not present on the state’s site such as a listing of testing sites and their contact information.
Jones also helped launch TheCOVIDMonitor.com to collect reports of infections in schools across the country.
Jones filed a whistleblower complaint against the state in July, accusing managers of retaliating against her for refusing to change the data to make the coronavirus situation look better.
“The Florida Department of Health needs a data auditor not affiliated with the governor’s office because they cannot be trusted,” Jones said Friday.
Florida Hides Death Details
When coronavirus kills someone, their county’s medical examiner’s office logs their name, age, ethnicity and other information, and sends it to the Florida Department of Law Enforcement.
During March and April, the department refused requests to release that information to the public, even though medical examiners in Florida always have made it public under state law. Many county medical examiners, acknowledging the role that public information can play in combating a pandemic, released the information without dispute.
But it took legal pressure from news outlets, including The Post, before FDLE agreed to release the records it collected from local medical examiners.
When FDLE finally published the document on May 6, it blacked out or excluded crucial information such as each victim’s name or cause of death.
But FDLE’s attempt to obscure some of that information failed when, upon closer examination, the seemingly redacted details could in fact be read by common computer software.
Outlets such as Gannett, which owns The Post, and The New York Times extracted the data invisible to the naked eye and reported in detail what the state redacted, such as the details of how each patient died.
Reluctantly Revealing Elder Care Deaths, Hospitalizations
It took a lawsuit against the state filed by the Miami Herald, joined by The Post and other news outlets, before the health department began publishing the names of long-term care facilities with the numbers of coronavirus cases and deaths.
The publication provided the only official source for family members to find out how many people had died of COVID-19 at the long-term care facility housing their loved ones.
While the state agreed to publish the information weekly, it has failed to publish several times and as of Nov. 24 had not updated the information since Nov. 6.
It took more pressure from Florida news outlets to pry from the state government the number of beds in each hospital occupied by coronavirus patients, which DeSantis himself has called a key indicator of the disease’s spread.
That was one issue where USF’s Salemi publicly criticized Florida.
“They were one of the last three states to release that information,” he said. “That to me is a problem because it is a key indicator.”
Confusion Over Positivity Rate
One metric DeSantis touted to justify his decision in May to begin reopening Florida’s economy was the so-called positivity rate, which is the share of tests reported each day with positive results.
But Florida’s daily figures contrasted sharply with calculations made by Johns Hopkins University, prompting a South Florida Sun-Sentinel examination that showed Florida’s methodology underestimated the positivity rate.
The state counts people who have tested positive only once, but counts every negative test a person receives until they test positive, so that there are many more negative tests for every positive one.
Johns Hopkins University, on the other hand, calculated Florida’s positivity rate by comparing the number of people testing positive with the total number of people who got tested for the first time.
By Johns Hopkins’ measure, between 10 and 11 percent of Florida’s tests in October came up positive, compared to the state’s reported rate of between 4 and 5 percent.
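The gap between the two methodologies comes down to what goes in the denominator. A hypothetical calculation (the figures below are illustrative, not the state’s actual counts) shows how counting every repeat negative test pushes the reported rate down:

```python
# Illustrative numbers only: 10,000 people tested for the first time,
# 500 of whom test positive; repeat testing adds 15,000 more negative results.
first_time_tested = 10_000
positive_people = 500
repeat_negative_tests = 15_000

# Johns Hopkins-style rate: people testing positive divided by
# people tested for the first time.
jh_rate = positive_people / first_time_tested

# Florida-style rate: each positive person counted once, but every
# negative test (including repeats) counted in the denominator.
florida_rate = positive_people / (first_time_tested + repeat_negative_tests)

print(f"Johns Hopkins method: {jh_rate:.1%}")   # 5.0%
print(f"Florida method: {florida_rate:.1%}")    # 2.0%
```

With the same underlying outbreak, the repeat-test denominator cuts the reported rate by more than half, the same direction of distortion the Sun-Sentinel examination found.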
Health experts such as those at the World Health Organization have said a state’s positivity rate should stay below 5 percent for 14 days straight before it considers the virus under control and goes forward with reopening public places and businesses. It’s also an important measure for travelers, who may be required to quarantine if they enter a state with a high positivity rate.
Withholding Detail on Race, Ethnicity
The Post reported in June that tests taken by Black and Hispanic people and in majority-minority ZIP codes were twice as likely to come back positive as tests conducted on white people and in majority-white ZIP codes.
That was based on a Post analysis of internal state data the health department will not share with the public.
The state publishes bar charts showing general racial breakdowns but not for each infected person.
If it wanted to, Florida’s health department could publish detailed data that would shed light on the infection rates among each race and ethnicity or each age group, as well as which neighborhoods are seeing high rates of contagion.
Researchers have been trying to obtain this data but “the state won’t release the data without (making us) undergo an arduous data use agreement application process with no guarantee of release of the data,” Adirim said. Researchers must read and sign a 26-page, nearly 5,700-word agreement before getting a chance at seeing the raw data.
While Florida publishes the ages, genders and counties of residence for each infected person, “there’s no identification for race or ethnicity, no ZIP code or city of the residence of the patient,” Adirim said. “No line item count of negative test data so it’s hard to do your own calculation of test positivity.”
While Florida doesn’t explain its reasoning, one fear of releasing such information is the risk of identifying patients, particularly in tiny, non-diverse counties.
Confusion Over Lab Results
Florida’s daily report shows how many positive results come from each laboratory statewide. Except when it doesn’t.
The report has shown for months that 100 percent of COVID-19 tests conducted by some labs have come back positive despite those labs saying that shouldn’t be the case.
While the department reported in July that all 410 results from a Lee County lab were positive, a lab spokesman told The Post the lab had conducted roughly 30,000 tests. Other labs expressed the same confusion when informed of the state’s reporting.
The state health department said it would work with labs to fix the error. But even as recently as Tuesday, the state’s daily report showed positive result rates of 100 percent or just under it from some labs, comprising hundreds of tests.
Mistakenly Revealing School Infections
As DeSantis pushed in August for reopening schools and universities for students to attend in-person classes, Florida’s health department published a report showing hundreds of infections could be traced back to schools, before pulling that report from public view.
The health department claimed it published that data by mistake, the Miami Herald reported.
The report showed that COVID-19 had infected nearly 900 students and staffers.
The state resumed school infection reporting in September.
A similar publication of cases at day-care centers appeared online briefly in August only to come down permanently.
After shifting in late April to updating the public just once a day at 11 a.m. instead of twice daily, the state met that deadline on most days, and pandemic followers could rely on the predictability. Then, in October, it started to falter.
On Oct. 10, the state published no data at all, not informing the public of a problem until 5 p.m.
The state blamed a private lab for the failure but the next day retracted its statement after the private lab disputed the state’s explanation. No further explanation has been offered.
On Oct. 21, the report came out six hours late.
Since Nov. 3, the 11 a.m. deadline has never been met. Now, late afternoon releases have become the norm.
“They have gotten more sloppy and they have really dragged their feet,” Adirim, the FAU scientist, said.
No spokesperson for the health department has answered questions from The Post to explain the lengthy delays. Alberto Moscoso, the spokesman throughout the pandemic, departed without explanation Nov. 6.
The state’s tardiness can trip up researchers trying to track the pandemic in Florida, Adirim said, because if one misses a late-day update, the department could overwrite it with another update the next morning, eliminating critical information and damaging scientists’ analysis.
Hired Sports Blogger to Analyze Data
As if to show disregard for concerns raised by scientists, the DeSantis administration brought in a new data analyst who bragged online that he is no expert and doesn’t need to be.
Kyle Lamb, an Uber driver and sports blogger, sees his lack of experience as a plus.
“Fact is, I’m not an ‘expert’,” Lamb wrote on a website for a subscribers-only podcast he hosts about the coronavirus. “I also don’t need to be. Experts don’t have all the answers, and we’ve learned that the hard way throughout the entire duration of the global pandemic.”
Much of his coronavirus writing can be found on Twitter, where he has said masks and mandatory quarantines don’t stop the virus’ spread, and that hydroxychloroquine, a drug touted by President Donald Trump but rejected by medical researchers, treats it successfully.
While DeSantis says lockdowns aren’t effective in stopping the spread and refuses to enact a statewide mask mandate, scientists point out that quarantines and masks are extremely effective.
The U.S. Food and Drug Administration has said hydroxychloroquine is unlikely to help and poses greater risk to patients than any potential benefits.
Coronavirus researchers have called Lamb’s views “laughable,” and fellow sports bloggers have said he tends to act like he knows much about a subject in which he knows little, the Miami Herald reported.
DeSantis has yet to explain how and why Lamb was hired, nor has his office released Lamb’s application for the $40,000-a-year job. “We generally do not comment on such entry level hirings,” DeSantis spokesman Fred Piccolo said Tuesday by email.
It could be worse.
Texas health department workers have to manually enter data they read from paper faxes into the state’s coronavirus tracking system, The Texas Tribune has reported. And unlike Florida, Texas doesn’t require local health officials to report viral data to the state in a uniform way that would make it easier and faster to process and report.
It could be better.
In Wisconsin, health officials report the number of cases and deaths down to the neighborhood level. They also plainly report racial and ethnic disparities, which show the disease hits Hispanic residents hardest.
Still, Salemi worries that Florida’s lack of answers can undermine residents’ faith.
“My whole thing is the communication, the transparency,” Salemi said. “Just let us know what’s going on. That can stop people from assuming the worst. Even if you make a big error people are a lot more forgiving, whereas if the only time you’re communicating is when bad things happen … people start to wonder.”