A new report from consulting firm Avalere Health and the nonprofit Physicians Advocacy Institute finds that the pandemic accelerated the rise in physician employment, with nearly 70 percent of doctors now employed by a hospital, insurer or investor-owned entity.
Researchers evaluated shifts to employment in the two-year period between January 2019 and January 2021, finding that 48,400 additional doctors left independent practice to join a health system or other company, with the majority of the change occurring during the pandemic. While 38 percent chose employment by a hospital or health system, the majority of newly employed doctors are now employed by a “corporate entity”, including insurers, disruptors and investor-owned companies.
(Researchers said they were unable to accurately break down corporate employers by entity, and that the study likely undercounts the number of physician practices owned by private equity firms, given the lack of transparency in that segment.) Growth rates in the corporate sector dwarfed health system employment, increasing a whopping 38 percent over the past two years, in comparison to a 5 percent increase for hospitals.
We expect this pace to continue throughout this year and beyond, as practices, enticed by the outsized offers put on the table by investors and payers, seek ongoing stability and look to manage the exit of retiring partners.
Medicare Advantage (MA)-focused companies, like Oak Street Health (14x revenues), Cano Health (11x revenues), and Iora Health (announced sale to One Medical at 7x revenues), reflect valuation multiples that appear irrational to many market observers. Multiples may be exuberant, but they are not necessarily irrational.
One reason for high valuations across the healthcare sector is the large pools of capital from institutional public investors, retail investors and private equity that are seeking returns higher than the low single-digit bond yields currently available. Private equity alone has hundreds of billions in investable funds seeking opportunities in healthcare. As a result of this abundance of capital chasing deals, there is a premium attached to the scarcity of available companies with proven business models and strong growth.
Valuations of companies that rely on Medicare and Medicaid reimbursement have traditionally been discounted for the risk associated with a change in government reimbursement policy. This “bop the mole” risk reflects the market’s assessment that when a particular healthcare sector becomes “too profitable,” the risk increases that CMS will adjust policy and reimbursement rates in that sector to drive down profitability. However, there appears to be consensus among both political parties that MA is the right policy to help manage the rise in overall Medicare costs, and thus incentives for MA growth can be expected to continue. This factor, combined with strong demographic growth in the overall senior population, means investors apply premiums to companies in the MA space compared to traditional providers.
Large pools of available capital, scarcity value, lower perceived sector risk and overall growth in the senior population are all factors that drive higher valuations for the MA disrupters. However, these factors pale in comparison to the underlying economic driver for these companies. Taking full risk for MA enrollees and dramatically reducing hospital utilization, while improving health status, is core to their business model. These companies target, and often achieve, reductions in hospital utilization of 30% or more for their assigned MA enrollees.
In 2019, the average number of Medicare hospital days per 1,000 in the U.S. was 1,190. With about $14,700 per Medicare discharge and a 4.5-day average length of stay (ALOS), the average cost per Medicare day is approximately $3,200. At the U.S. average of 1,190 Medicare hospital days per thousand, if MA hospital utilization is decreased by 25%, the net hospital revenue per 1,000 MA enrollees is reduced by about $960,000.
If one of the MA disrupters has, for example, 50,000 MA lives in a market, the decrease in hospital revenues for that MA population would be about $48 million. This does not include the associated physician fees and other costs in the care continuum. That same $48 million-plus in the coffers of the risk-taking MA disrupters allows them to deliver a comprehensive array of supportive services, including addressing social determinants of health. These services then further reduce utilization and improve overall health status, creating a virtuous circle. This is very profitable.
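To make the arithmetic above easier to follow, here is a minimal sketch of the calculation in Python, using only the approximate figures quoted in the text (1,190 Medicare days per 1,000, roughly $14,700 per discharge, a 4.5-day ALOS, a 25% utilization reduction, and an illustrative 50,000-life market). The text rounds the intermediate results slightly; the sketch prints them unrounded.

```python
# Back-of-the-envelope sketch of the MA hospital-utilization arithmetic above.
# All inputs are the approximate figures quoted in the text.

medicare_days_per_1000 = 1_190   # average U.S. Medicare hospital days per 1,000 (2019)
revenue_per_discharge = 14_700   # approximate revenue per Medicare discharge ($)
avg_length_of_stay = 4.5         # average length of stay per discharge (days)
utilization_reduction = 0.25     # assumed reduction in MA hospital utilization
ma_lives = 50_000                # illustrative MA enrollment in one market

# Cost of one Medicare hospital day (~$3,267; quoted as ~$3,200)
cost_per_day = revenue_per_discharge / avg_length_of_stay

# Hospital days avoided, and revenue removed, per 1,000 MA enrollees (quoted as ~$960,000)
days_avoided_per_1000 = medicare_days_per_1000 * utilization_reduction
savings_per_1000 = days_avoided_per_1000 * cost_per_day

# Scale to the illustrative 50,000-life market (quoted as ~$48 million)
market_savings = savings_per_1000 * (ma_lives / 1_000)

print(f"Cost per Medicare day:          ${cost_per_day:,.0f}")
print(f"Savings per 1,000 MA enrollees: ${savings_per_1000:,.0f}")
print(f"Savings across 50,000 MA lives: ${market_savings:,.0f}")
```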
MA is only the beginning. When successful MA businesses expand beyond MA, and they will, disruption across the healthcare economy will be profound and painful for the incumbents. The market is rationally exuberant about that.
A topic that’s come up in almost every discussion we’ve had with health system executive teams and boards recently is workforce strategy. Beyond the immediate political debate about whether temporary unemployment benefits are exacerbating a shortage of workers, there’s a growing recognition that the healthcare workforce is approaching something that looks like a “perfect storm”.
The workforce is mentally and physically exhausted from the pandemic, which has taken a toll both professionally and personally. Many workers are rethinking their work-life balance equations in the wake of a difficult year, during which working conditions and family responsibilities shifted dramatically. That, along with broader economic inflation, is driving demands for higher wages and a more robust set of benefits.
Meanwhile, many health systems are shifting into cost-cutting mode, due to COVID-related shifts in demand patterns and continued downward pressure on reimbursement rates, forcing a renewed focus on workforce productivity.
These combined forces threaten to create a negative spiral, which could lead to even worse shortages and deteriorating workplace engagement. It’s striking how quickly the “hero” narrative has shifted to a “crisis” narrative, and we agree completely with one health system board member who told us recently that workforce strategy is now the number one issue on his agenda.
No easy answers here, but we’ll continue to report on innovative approaches to addressing these difficult challenges.
Late last week, retail giant Walmart announced its plan to acquire national telemedicine provider MeMD, for an undisclosed sum. According to Dr. Cheryl Pegus, Walmart’s executive vice president for health, the acquisition “complements our brick-and-mortar Walmart Health locations”, allowing the company to “expand access and reach consumers where they are”.
MeMD, founded in 2010, provides primary care and mental health services to five million patients nationally. The acquisition extends Walmart’s health delivery capabilities beyond the handful of in-store and store-adjacent clinics it runs, and follows the launch of its own Medicare Advantage-focused broker business, and partnership with Medicare Advantage start-up Clover Health to offer a co-branded insurance product.
Walmart has been climbing the healthcare learning curve for several years, building on its sizeable retail pharmacy business, and seems to have hit on a successful formula in its latest in-person clinic model, which includes primary care, behavioral health, vision, and dental services. The retailer plans to add 22 new clinic locations by the end of this year, and its new telemedicine offering will allow it to expand its virtual reach even further.
The MeMD acquisition also represents a new front in Walmart’s head-to-head competition with Amazon, which launched its own national telemedicine service earlier this year. That service, Amazon Care, is targeted at the employer market, and right on cue, Amazon announced its first customer sale last week—to Precor, a fitness equipment company.
Both retail giants are slowly circling the $3.6T healthcare industry, targeting inefficiencies by deploying their expertise in convenience and consumer engagement. Incumbents beware.
Federally-subsidized childcare centers took care of an estimated 550,000 to 600,000 children while their mothers worked wartime jobs.
When the United States started recruiting women for World War II factory jobs, there was a reluctance to call stay-at-home mothers with young children into the workforce. That changed when the government realized it needed more wartime laborers in its factories. To allow more women to work, the government began subsidizing childcare for the first (and only) time in the nation’s history.
An estimated 550,000 to 600,000 children received care through these facilities, which cost parents around 50 to 75 cents per child, per day (in 2021, that’s less than $12). But like women’s employment in factories, the day care centers were always meant to be a temporary wartime measure. When the war ended, the government encouraged women to leave the factories and care for their children at home. Despite receiving letters and petitions urging the continuation of the childcare programs, the U.S. government stopped funding them in 1946.
Before World War II, organized “day care” didn’t really exist in the United States. The children of middle- and upper-class families might go to private nursery schools for a few hours a day, says Sonya Michel, a professor emerita of history, women’s studies and American studies at the University of Maryland-College Park and author of Children’s Interests/Mothers’ Rights: The Shaping of America’s Child Care Policy. (In German communities, five- and six-year-olds went to half-day Kindergartens.)
For children from poor families whose father had died or couldn’t work, there were day nurseries funded by charitable donations, Michel says. But there were no affordable all-day childcare centers for families in which both parents worked—a situation that was common for low-income families, particularly Black families, and less common for middle- and upper-class families.
The war temporarily changed that. In 1940, the United States passed the Defense Housing and Community Facilities and Services Act, known as the Lanham Act, which gave the Federal Works Agency the authority to fund the construction of houses, schools and other infrastructure for laborers in the growing defense industry. It was not specifically meant to fund childcare, but in late 1942, the government used it to fund temporary day care centers for the children of mothers working wartime jobs.
Communities had to apply for funding to set up day care centers; once they did, there was very little federal involvement. Local organizers structured childcare centers around a community’s needs. Many offered care at odd hours to accommodate the schedules of women who had to work early in the morning or late at night. They also provided up to three meals a day for children, with some offering prepared meals for mothers to take with them when they picked up their kids.
“The ones that we often hear about were the ‘model’ day nurseries that were set up at airplane factories [on the West coast],” says Michel. “Those were ones where the federal funding came very quickly, and some of the leading voices in the early childhood education movement…became quickly involved in setting [them] up,” she says.
For these centers, organizers enlisted architects to build attractive buildings that would cater to the needs of childcare, specifically. “There was a lot of publicity about those, but those were unusual. Most of the childcare centers were kind of makeshift. They were set up in church basements or garages.”
Though the quality of care varied by center, there hasn’t been much study of how this quality related to children’s race (in the Jim Crow South, where schools and recreational facilities were segregated, childcare centers were likely segregated too). At the same time that the United States was debuting subsidized childcare, it was also incarcerating Japanese American families in internment camps. So although these childcare facilities were groundbreaking, they didn’t serve all children.
Subsidized Childcare Ends When War Ends
When the World War II childcare centers first opened, many women were reluctant to hand their children over to them. According to Chris M. Herbst, a professor of public affairs at Arizona State University who has written about these programs in the Journal of Labor Economics, a lot of these women ended up having positive experiences.
“A couple of childcare programs in California surveyed the mothers of the kids in childcare as they were leaving childcare programs,” he says. “Although they were initially skeptical of this government-run childcare program and were worried about the developmental effects on their kids, the exit interviews revealed very, very high levels of parental satisfaction with the childcare programs.”
As the war ended in August 1945, the Federal Works Agency announced it would stop funding childcare as soon as possible. Parents responded by sending the agency 1,155 letters, 318 wires, 794 postcards and petitions with 3,647 signatures urging the government to keep them open. In response, the U.S. government provided additional funding for childcare through February 1946. After that, it was over.
Lobbying for national childcare gained momentum in the 1960s and ‘70s, a period when many of its advocates may have themselves gone to World War II day care as kids. In 1971, Congress passed the Comprehensive Child Development Act, which would have established nationally-funded, locally-administered childcare centers.
This was during the Cold War, a time when anti-childcare activists pointed to the fact that the Soviet Union funded childcare as an argument for why the United States shouldn’t. President Richard Nixon vetoed the bill, arguing that it would “commit the vast moral authority of the National Government to the side of communal approaches to child rearing over against the family-centered approach.”
In this case, “family-centered” meant the mother should care for the children at home while the father worked outside of it—regardless of whether this was something the parents could afford or desired to do. World War II remains the only time in U.S. history that the country came close to instituting universal childcare.