The US Funded Universal Childcare During World War II—Then Stopped

A woman reading a story to three young children at a Child Care Center in New Britain, Connecticut. Photographed by Gordon Parks for the Office of War Information, June 1943.

Federally subsidized childcare centers took care of an estimated 550,000 to 600,000 children while their mothers worked wartime jobs.

When the United States started recruiting women for World War II factory jobs, there was a reluctance to call stay-at-home mothers with young children into the workforce. That changed when the government realized it needed more wartime laborers in its factories. To allow more women to work, the government began subsidizing childcare for the first (and only) time in the nation’s history.

An estimated 550,000 to 600,000 children received care through these facilities, which cost parents around 50 to 75 cents per child, per day (less than $12 in 2021 dollars). But like women’s employment in factories, the day care centers were always meant to be a temporary wartime measure. When the war ended, the government encouraged women to leave the factories and care for their children at home. Despite receiving letters and petitions urging the continuation of the childcare programs, the U.S. government stopped funding them in 1946.
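A quick check on that inflation adjustment, assuming the Bureau of Labor Statistics’ annual consumer price index averages of about 17.3 for 1943 and 271 for 2021 (index values not given in the article):

0.75 × (271 ÷ 17.3) ≈ 11.75

That puts a 75-cent daily fee at just under $12 in 2021 dollars, consistent with the estimate above.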

Before World War II, organized “day care” didn’t really exist in the United States. The children of middle- and upper-class families might go to private nursery schools for a few hours a day, says Sonya Michel, a professor emerita of history, women’s studies and American studies at the University of Maryland-College Park and author of Children’s Interests/Mothers’ Rights: The Shaping of America’s Child Care Policy. (In German communities, five- and six-year-olds went to half-day Kindergartens.)

For children from poor families whose father had died or couldn’t work, there were day nurseries funded by charitable donations, Michel says. But there were no affordable all-day childcare centers for families in which both parents worked—a situation that was common for low-income families, particularly Black families, and less common for middle- and upper-class families.

The war temporarily changed that. In 1940, the United States passed the Defense Housing and Community Facilities and Services Act, known as the Lanham Act, which gave the Federal Works Agency the authority to fund the construction of houses, schools and other infrastructure for laborers in the growing defense industry. It was not specifically meant to fund childcare, but in late 1942, the government used it to fund temporary day care centers for the children of mothers working wartime jobs.

Communities had to apply for funding to set up day care centers; once they did, there was very little federal involvement. Local organizers structured childcare centers around a community’s needs. Many offered care at odd hours to accommodate the schedules of women who had to work early in the morning or late at night. They also provided up to three meals a day for children, with some offering prepared meals for mothers to take with them when they picked up their kids.

“The ones that we often hear about were the ‘model’ day nurseries that were set up at airplane factories [on the West Coast],” says Michel. “Those were ones where the federal funding came very quickly, and some of the leading voices in the early childhood education movement…became quickly involved in setting [them] up,” she says. 

For these centers, organizers enlisted architects to design attractive buildings tailored specifically to the needs of childcare. “There was a lot of publicity about those, but those were unusual. Most of the childcare centers were kind of makeshift. They were set up in church basements or garages.”

Though the quality of care varied by center, there hasn’t been much study of how that quality related to children’s race (in the Jim Crow South, where schools and recreational facilities were segregated, childcare centers were likely segregated too). At the same time that the United States was debuting subsidized childcare, it was also incarcerating Japanese American families in internment camps. So although these childcare facilities were groundbreaking, they didn’t serve all children.

Subsidized Childcare Ends When War Ends

Ruth Pease opened the Little Red School House in 1945 in response to the country’s request for help in meeting the child care needs of the post-war community.

When the World War II childcare centers first opened, many women were reluctant to hand their children over to them. According to Chris M. Herbst, a professor of public affairs at Arizona State University who has written about these programs in the Journal of Labor Economics, a lot of these women ended up having positive experiences.

“A couple of childcare programs in California surveyed the mothers of the kids in childcare as they were leaving childcare programs,” he says. “Although they were initially skeptical of this government-run childcare program and were worried about the developmental effects on their kids, the exit interviews revealed very, very high levels of parental satisfaction with the childcare programs.”

As the war ended in August 1945, the Federal Works Agency announced it would stop funding childcare as soon as possible. Parents responded by sending the agency 1,155 letters, 318 wires, 794 postcards and petitions with 3,647 signatures urging the government to keep the centers open. In response, the U.S. government provided additional funding for childcare through February 1946. After that, it was over.

Lobbying for national childcare gained momentum in the 1960s and ‘70s, a period when many of its advocates may have themselves gone to World War II day care as kids. In 1971, Congress passed the Comprehensive Child Development Act, which would have established nationally funded, locally administered childcare centers.

This was during the Cold War, a time when anti-childcare activists pointed to the fact that the Soviet Union funded childcare as an argument for why the United States shouldn’t. President Richard Nixon vetoed the bill, arguing that it would “commit the vast moral authority of the National Government to the side of communal approaches to child rearing over against the family-centered approach.”

In this case, “family-centered” meant the mother caring for the children at home while the father worked outside it, regardless of whether the parents could afford or wanted that arrangement. World War II remains the only time in U.S. history that the country came close to instituting universal childcare.

US health care: An industry too big to fail

https://theconversation.com/us-health-care-an-industry-too-big-to-fail-118895


As I spoke recently with colleagues at a conference in Florence, Italy about health care innovation, a fundamental truth resurfaced in my mind: the U.S. health care industry is just that. An industry, an economic force, Big Business, first and foremost. It is a vehicle for returns on investment first and the success of our society second.

This is critical to consider as presidential candidates unveil their health care plans. The candidates and the electorate seem to forget that health care in our country is a huge business.

Health care accounts for almost 20% of GDP and is a, if not the, job engine for the U.S. economy. The sector added 2.8 million jobs between 2006 and 2016, more than any other sector, and the Bureau of Labor Statistics projects another 18% growth in health sector jobs between now and 2026. Big Business indeed.

This basic truth separates us from every other nation whose life expectancy, maternal and infant mortality or incidence of diabetes we’d like to replicate or, better still, outperform.

As politicians and the public they serve grapple with issues such as prescription drug prices, “surprise” medical bills and other health-related issues, I believe it critical that we better understand some of the less visible drivers of these costs so that any proposed solutions have a fighting chance to deflect the health cost curve downward.

As both associate chief medical officer for clinical integration and director of the center for health policy at the University of Virginia, I grapple every day with the tension between a profit-driven health care system and its high costs.

The power of the market

Housing prices are market-driven. Car prices are market-driven. Food prices are market-driven.

And so are health care services. That includes physician fees, prescription drug prices and non-prescription drug prices. The same goes for hospital administrator salaries and medical devices.

All of these goods and services are sold by profit-seeking businesses, all of which are motivated to maximize profits and minimize the cost of doing business. All must adhere to sound business principles, or they will fail. None of them disclose their cost drivers – the factors that push prices up. In other words, costs that are hidden from consumers show up in the final unit prices.

To my knowledge, no one has suggested that Rolls-Royce Motor Cars should price its cars similarly to Ford Motor Company. The invisible hand of “the market” tells Rolls-Royce and Ford what their vehicles are worth.

Prescription drug pricing has different rules

Ford could (though it won’t) tell you precisely how much each vehicle costs to produce, including all the component parts it acquires from other firms. But this is not true of prescription drugs. How much a novel therapeutic costs to develop and bring to market is a proverbial black box. Companies don’t share those numbers. Researchers at the Tufts Center for the Study of Drug Development have estimated the costs to be as high as US$2.87 billion, but that number has been hotly debated.

What we can reliably say is that it’s very expensive, and a drug company must keep producing new drugs to stay in business. The millions of research and development (R&D) dollars invested by Big Pharma have two aims. The first is to bring the “next big thing” to market. The second is to secure the almighty patent for it.

U.S. drug patents typically last 20 years, but according to the legal services website Upcounsel.com: “Due to the rigorous amount of testing that goes into a drug patent, many larger pharmaceutical companies file several patents on the same drug, aiming to extend the 20-year period and block generic competitors from producing the same drug.” As a result, drug firms can have 30 or 40-plus years to protect their investment from competition, and the market forces that would lower prices are not in play.

Here’s the hidden-cost punchline: meanwhile, several other drugs in a firm’s R&D pipeline fail along the way, resulting in significant product-specific losses. How is a poor firm to stay afloat? Simple, really. Build those costs and losses into the price of the successes. Next thing you know, insulin is nearly US$1,500 for a 20-milliliter vial, when that same vial 15 years ago was about $157.
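For a sense of scale, using only the figures above: a rise from about $157 to nearly $1,500 over 15 years implies an average annual price increase of

(1,500 ÷ 157)^(1/15) ≈ 1.16

or roughly 16 percent per year, compounding far faster than general inflation.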

It’s actually a bit more complicated than that, but my point is that business principles drive drug prices because drug companies are businesses. Societal welfare is not the underlying aim. This is most true in the U.S., where the public doesn’t purchase most pharmaceuticals – private individuals do, albeit through a third party, an insurer. Because purchasing is fragmented, the group purchasing power of 300 million Americans is ceded to the commercial power of markets. Prices go up.

The cost of doing business, er, treating

I hope that most people would agree that physicians provide a societal good. Whether as a trusted health confidant or as the surgeon stopping the bleeding from your spleen after that jerk cut you off on the highway, we physicians pride ourselves on being there for our patients, no matter what, insured or not.

Allow me to state two fundamental facts that often seem to elude patient and policymaker alike. They are inextricably linked, foundational to our national dialogue on health care costs and oft-ignored: physicians are among the highest earners in America, and we make our money from patients. Not from investment portfolios, or patents. Patients.

Like Ford or pharmaceutical giant Eli Lilly, physician practices also need to achieve a profit margin to remain in business. And there are hidden-to-consumer costs here as well; in this case, education and training. Medical school is the most expensive professional degree money can buy in the U.S. The Association of American Medical Colleges reports that median indebtedness for U.S. medical school graduates was $200,000 in 2018, for the 75% of us who financed our educations rather than paying cash.
Our “R&D” – that is, the cost of four years each of college and medical school, plus three to 11 years of postdoctoral training – gets incorporated into our fees. It has to. Just like Ford’s. Business 101: the cost of doing business must be factored into the price of the good or service.

For policymakers to meaningfully impact the rising costs of U.S. health care, from drug prices to surprise bills and everything in between, they must decide whether this is to remain an industry or truly become a social good. If we continue to treat and regulate health care as an industry, we should continue to expect surprise bills and expensive drugs.

It’s not personal, it’s just…business. The question before the U.S. is: business-as-usual, or shall we get busy charting a new way of achieving a healthy society? Personally and professionally, I prefer the latter.