Segment 2 – Brief History of U.S. Healthcare

In Segment 2, I will answer the question: How did we get here? I’ll give a whirlwind tour of the history of medical care in the U.S., and I’ll also look at the birth of health insurance.

Let’s start by looking at healthcare in the Colonial period. The most famous doctor at the time was Benjamin Rush. He – like most reputable professionals of the day – got his medical training in Europe, in his case in Edinburgh, Scotland, the leading medical center of the time. Rush was a signer of the Declaration of Independence and served in the Revolutionary Army. He became the “father of American psychiatry” because of his interest in mental illness as a disease, not demon possession.

Rush and other orthodox practitioners in the early Republic – trained in the scientific European tradition – faced competition from a panoply of practitioners in an unlicensed, unregulated “free market.” These rivals peddled nostrums like snake oil and procedures such as blood-letting. Doctors of all types trained as apprentices. The sick were cared for in their homes, with the poor going to almshouses and the mentally ill to asylums. Port cities did have public pesthouses for quarantines.

By mid-19th century, orthodox doctors began trying to solidify their place in the market. They did this through training at medical schools, beginning with Harvard, Dartmouth, the College of Philadelphia (which eventually became the University of Pennsylvania) and King’s College (which eventually became Columbia). But by 1850 there were 42 medical schools, many of them little more than diploma mills. The course consisted of only two semesters of 3 months each. A medical school needed only 4 faculty, 1 classroom, 1 dissection lab and a charter to grant degrees. These schools were highly profitable.

In 1847, the AMA (American Medical Association) was founded by the orthodox physicians.

Meanwhile, the era of scientific medicine was blossoming in Austria, Britain, Germany and France. Here are some milestones – anesthesia, microbiology (the study of invisible germs), antiseptic surgical technique and x-rays.

In America, by the turn of the century, doctors and the AMA sought to further shore up their legitimacy by reforming medical education. States began requiring more formal education as a condition for licensure. The Association of American Medical Colleges was founded in 1876. In 1893, Dr. William Welch brought to Johns Hopkins University the German model of education based on 3 or 4 years of training in the clinical sciences. The Carnegie Foundation, endowed by industrialist and philanthropist Andrew Carnegie, commissioned Abraham Flexner to draw up a blueprint for medical school reform. His 1910 report is widely credited with ushering in the era of modern medicine in this country.

In the early 20th century, doctors enjoyed prestige and independence. Courts ruled against corporations practicing medicine, ensuring the pre-eminence of private practice. Doctors joined together in hospitals to take care of growing populations in big cities, and to exploit emerging surgical and diagnostic technologies.

This brings us to health insurance. Surgery (which made great advances during the Civil War and World War I) and hospitals were becoming expensive. So in 1929, Baylor University Hospital started the first pre-paid hospital insurance. Baylor’s 1,200 teachers each paid 50 cents per month to cover up to 21 days of hospitalization. Surgeons and hospitals quickly embraced this arrangement, and the Baylor plan became the Blue Cross plan in Minnesota in 1933 and Texas in 1934. By 1950, Blue Cross covered 57% of the population.

Why did private insurance, rather than public insurance, take hold? After World War I, the war-torn countries of Europe, like Germany, were in turmoil. Social insurance, including healthcare, helped reestablish some social stability there. But in this country, politicians opposed Teddy Roosevelt’s plan to set up national health insurance for factory workers, calling public insurance a “Prussian menace”. Labor unions saw public health insurance as an encroachment on their special role to ensure worker benefits. The AMA also opposed public health insurance as a potential “interference with the practice of medicine.”

Then during World War II, wages were frozen, but companies were allowed to give health insurance benefits instead. This sowed the seeds of the employer-based health insurance system. In 1948 the Supreme Court decided that unions could use health benefits in collective bargaining agreements. Then in 1954 Congress made employer-paid health premiums non-taxable. By the mid-1960s employer-paid health insurance was nearly universal.

Let’s summarize the history of medicine from Rush to Medicare. In the Colonial period and early Republic there was intense competition among doctors of all stripes: those who tried to understand the scientific basis of disease and those who peddled remedies on a trial-and-error basis (referred to as “empirical”), relying mostly on their placebo effect. We still see vestiges of this early competition today in the rivalry between MDs, DOs, chiropractors and podiatrists. During the industrial period, little by little, science-based orthodox physicians in the European tradition prevailed over their rivals, introducing advances in surgery, diagnosis and infection control. They shored up their gains with institutions such as the AMA, hospitals, and eventually insurance.

In the next Segment, we will look at reform movements, starting with Medicare and Medicaid in the 1960s. We will also look at why later reforms failed and where that leaves us now.

I’ll see you then.







The factors at play may be a bit more complicated than some expect.

New York University made waves last week when it announced it will cover all tuition expenses for all its medical students in perpetuity.

While the news left some NYU alumni wishing they had waited just a few years longer to enroll, it also reanimated debate over the problems high educational debt levels can cause for doctors and the healthcare delivery system at large.

In an op-ed for Stat News, fourth-year NYU medical student Eli Cahan noted that researchers have identified debt burden as one of the factors propelling doctors into more lucrative specialty areas, leaving a shortage of primary care physicians.

“Eliminating medical school tuition, and thus medical school debt, could help nudge more students to choose much-needed careers in primary care,” Cahan wrote.

But research suggests the burden of educational debt is one factor among several in a complex relationship affecting the likelihood of new doctors to choose primary care.

A study published in 2014 by the Annals of Family Medicine, for example, concluded that reducing debt for medical students could help to enlarge the primary care physician workforce but that the positive effects of such an initiative would vary depending on which types of students benefited from the debt reduction.

“Students from lower-income families, in general, tend to have more interest in primary care careers, and students from disadvantaged backgrounds tend to have more interest in primary care careers,” lead researcher Julie Phillips, MD, MPH, an associate professor of family medicine at Michigan State University College of Human Medicine, tells HealthLeaders.

“Those same students also tend to have more debt because students from lower-income families don’t have the parents’ resources to help them with medical school costs,” Phillips adds.

The study, which analyzed data from more than 136,000 physicians, found that students from higher-income families were significantly less influenced by educational debt in their specialization decisions. What’s more, although the researchers found that educational debt deterred graduates of public medical schools from choosing primary care, the same could not be said of graduates of private medical schools.

This research suggests that the benefits of tuition-free medical school on primary care could be diminished by the fact that NYU is a private institution. That’s not to say, of course, that the decision to go tuition-free comes without benefits.

“I think it will make a difference. It’s hard to know if it will make a big difference,” Phillips says, calling NYU’s announcement “a great step in the right direction.”

“It would be really wonderful if public schools were able to implement this,” she adds, “but I just think they don’t typically have those resources.”

“The income disparity between primary care doctors and specialists in the United States is concerning, and there’s pretty good evidence that if you change that, the whole game changes,” Phillips says.

“As a culture, we tend to look on people who make money as being more valuable,” she adds, “so the income disparity between primary care doctors and specialists sort of creates a culture or contributes to a culture where primary care is not as well-respected, and that’s a really big problem.”




The doctor of the future


In new healthcare systems, ‘the doctor’ is increasingly a team. Can actual physicians adapt?

When patients go to see Dr. C.T. Lin for a checkup, they don’t see just Dr. Lin. They see Dr. Lin and Becky.

Becky Peterson, the medical assistant who works with Lin, sits down with patients first and asks them about their symptoms and medical history—questions Lin used to ask. When Lin comes in the room, she stays to take notes and cue up orders for tests and services such as physical therapy. When he leaves, she makes sure the patient understands his instructions.

The division of labor lets Lin stay focused on listening to patients and solving problems. “Now I’m just left with the assessment and the plan—the medical decisions—which is really my job,” Lin says in a quiet moment after seeing a patient at the Denver clinic where he works.

For generations, when Americans sought health care, they went to see their family doctor. But these days, they’ll often sit down with a physician assistant or nurse practitioner instead. Or they’ll spend a large part of their visit talking to a non-doctor, like Peterson, who takes care of an increasing number of tasks doctors used to handle.

Driven by efforts to control costs and improve outcomes, it’s one of the biggest shifts in the American health care workforce. Medicine increasingly looks like a team sport, with duties and jobs that used to fall to a family doctor now executed by a team, from nurses who sit down with patients to discuss diet and exercise to clinical pharmacists who monitor a patient’s medication. The doctor, in this model, is a kind of quarterback, overseeing care plans, stepping in mostly for the toughest cases and most difficult decisions.

Under some models, the doctor may recede even further into the background, leaving advanced practice nurses or other highly qualified professionals in charge.

It’s no longer true “that you’re a sole cowboy out there, saving the patient on your own,” says Mark Earnest, head of internal medicine at the University of Colorado medical school.

The shifting role of doctors is expected to accelerate in the coming decades, as the number of older Americans increases dramatically, many of them living longer with chronic diseases that need monitoring but not necessarily the expensive attention of a physician at every visit.

Doctors increasingly oversee the work of a team of medical professionals, including nurses and medical assistants, who handle much of the direct interaction with patients.

This isn’t the job many physicians trained for—or that some want. Even doctors who support team-based care have trouble adjusting to the new workflow. Some don’t like the idea that they aren’t always the ones in charge. Others, sick of the industry pressures, are opting out and setting up independent practices that don’t accept health insurance.

But most doctors will have to adapt. Change is coming, regardless of the fate of the Affordable Care Act or other laws designed to reward health systems for outcomes rather than the number of procedures performed, says Randall Wilson, an associate research director for Jobs for the Future, a nonprofit that advocates for increasing job skills. “People see the writing on the wall,” he says.

New models

Americans spend more on health care than people in other wealthy nations. Yet Americans live shorter lives and are more likely to be obese or hospitalized for chronic conditions, such as asthma or diabetes.

Health care experts have long blamed these lousy results on our fragmented health care system. Americans rely on a mix of specialists and settings for care, but those pieces of the health care system don’t necessarily communicate or coordinate with each other.

They also blame the high costs partly on the fee-for-service payment system, which rewards hospitals, clinics and doctors for the volume of procedures they provide. Health insurers will pay for a patient to sit down with a doctor. What they sometimes don’t pay for are other services that help patients stay healthy, such as a visit from a community health worker or a phone call with a nurse. Yet such services can prevent medical emergencies and save patients and their insurers a lot of money on expensive treatments.

New payment models encourage health systems to deploy their workers more efficiently — while also avoiding unnecessary services and costly errors. For instance, Medicare already gives some hospitals a single payment to cover everything that happens to a patient from the moment he enters a hospital for knee replacement surgery to three months after he goes home.

Distributing work across team members can help keep costs down, relieve doctors of the busywork that jams up their day, and make everyone more productive.

At least, that’s the idea. There isn’t yet strong research that proves teams provide better or cheaper care, says Erin Fraher, director of the Carolina Health Workforce Research Center, a national research center at the University of North Carolina. Studies do show that nurse practitioners can deliver care as well as physicians, “but talking about substitution of one provider for another is not team-based care,” she says.

Major physician associations support improving teamwork and collaboration among health care professionals. So do medical school leaders. For some years now, accreditors have required colleges and universities that train doctors, nurses, pharmacists, dentists and public health experts to teach students to work in interprofessional teams.

But when it comes to the question of who is in charge, that’s where friction arises. Many doctors aren’t comfortable with the idea that they don’t always need to be in charge. The American College of Physicians holds that a physician must always lead care teams, says Ken Shine, professor of medicine at the Dell Medical School at the University of Texas at Austin, but he disagrees.

“My argument is there are situations where another health professional needs to be directing the team,” Shine says. For instance, a nutritionist could create and manage a care plan for a diabetic patient.

Medical associations have also pushed back against proposals to expand the medical decisions non-doctors are able to make on their own. Health professionals’ so-called “scope of practice” is governed by laws that vary from state to state. “While some scope expansions may be appropriate, others definitely are not,” the American Medical Association says on its website.

In a statement, the association says it “encourages physician-led health care teams that utilize the unique knowledge and valuable contributions of all clinicians to enhance patient outcomes.” It noted that top hospital systems are using physician-led teams to improve patients’ health while reducing costs.

To be sure, doctors aren’t being displaced anytime soon. But shifting tasks to other professionals reduces the need to train so many of them. According to a study by the Rand Corporation, a nonpartisan think tank, a standard primary care team model requires about seven doctors per 10,000 patients. Increasing the numbers of nurse practitioners and physician assistants can drop that ratio to six doctors per 10,000, and in clinics run by highly trained nurses (known as nurse-managed health centers) the ratio drops to less than one doctor per 10,000.

Culture Change

Hospital systems like UCHealth, the University of Colorado-affiliated system where Lin and Peterson work, are betting that the future of health care involves a mix of professionals sharing responsibility for patients. Doctors will still run the show, but they’ll have to give up some control.

That culture change makes many doctors uneasy at first. Doctors want to protect their one-on-one relationship with patients. They may not understand what their non-physician colleagues have been trained to do, or are legally able to do. And many worry that change will make them even busier, by forcing them to manage the lower-credentialed professionals around them.

Lin is the chief information officer for UCHealth. As an administrator, he’s always pushing for change—his latest project is a system that releases certain test results to patients in real time. But as a practicing doctor, he also understands that change is hard.

He says that having Peterson in the examination room with him took some getting used to. “Like many doctors, I have a fear of letting go of all the things I traditionally do,” he says. That includes documenting a visit. “I’m getting over it, because I don’t want to be the only one here at 8 o’clock at night, typing.”

Matt Moles, a doctor who practices in the same clinic, says he also initially felt uncomfortable. Sharing the examination room went against his medical training, he says: “We’re trained to trust no one.”

It’s still possible for doctors to have jobs that resemble the Norman Rockwell era of long consultations—if they’re willing to opt out of the mainstream. A small but growing number are setting up or joining practices that, rather than taking health insurance, charge patients a monthly fee—typically around $75—for unlimited visits.

“I personally have the mentality of—leave me alone, I’ll take care of my patients,” says Dr. Cory Carroll, when reached by phone at his family care practice in Fort Collins, Colorado. He’s been a solo practitioner for most of his 25-year career.

Carroll has about 300 patients, a fraction of the patient load of a typical doctor in a big health care system. He sits with patients for over an hour if he has to. He visits them at home. He helps them connect with social services and community organizations. And he can focus on what he loves most: teaching patients to eat a healthier diet.

His practice is proof that it’s still possible for a family doctor to do it all. But he emphasizes that his experience is unusual. “I’m absolutely an outlier,” he says. Less than a quarter of all internal medicine doctors in the U.S. have a solo practice, according to the American Medical Association’s latest survey. And although the model Carroll has embraced is growing, it serves a more affluent slice of the patient population than a major hospital system such as UCHealth.

The team-based future

UCHealth’s leaders are so sure that team-based care is the future that newly built clinics, such as the one in Denver’s Lowry neighborhood at which Lin and Peterson work, are literally built for teamwork. Examination rooms don’t line long hallways; instead, they ring desk space where nurses, physicians and medical assistants sit side-by-side.

But the clinic is still in the early stages of transforming its teams. The best place in Denver to watch a diverse set of health professionals working together is across town, at a facility run by Denver Health, the city’s public safety-net hospital system. The facility includes a primary care clinic, an urgent care center and a pharmacy.

One recent morning, the distant wail of a baby in the waiting room announced the start of another busy day. Doctors, physician assistants, nurse practitioners and medical assistants were already typing away at the computers in their cubicles, trying to get a head start before the first patients were shown in to examination rooms.

“A lot of Denver Health patients are so complex,” explains Dr. Benjamin Feijoo, looking up from his desk. Patients often have multiple health issues, too many to handle in a typical 20-minute visit. “It’s a bit of a crunch,” he says.

So Feijoo turns to his colleagues for help. For instance, if a patient has both a medical and a mental health issue, Feijoo can address the medical problem and then ask a mental health specialist to step into the examination room and tackle the mental health problem.

If a patient needs, say, a crash course on prenatal health, she can meet with a nurse for an hourlong discussion. And if a living situation is compromising a patient’s health—such as unstable housing, or insufficient access to healthy food—the clinic’s social worker will try to find a solution.

The clinic also employs two community health workers, who spread the word about Denver Health in low-income neighborhoods, and a patient navigator, who calls the clinic’s patients when they leave a Denver Health hospital (and, for a subset of patients, other major local hospitals) and helps them schedule a follow-up appointment with their primary care provider.

Denver Health began expanding its care teams in 2012, when it received a $20 million federal grant. The system spent about half the money on hiring staff such as social workers, patient navigators and clinical pharmacists and the rest on software that identifies patients who are spending avoidable time in the hospital, including people who are homeless or have a serious but treatable condition, such as HIV. New, smaller clinics wrap even more services around those patients, allowing them to come in for multi-hour visits.

The new system now saves Denver Health—an integrated system, which includes a health plan—so much money on hospital stays and emergency room visits that it covers the salaries of the additional hires, says Tracy Johnson, the director of health reform initiatives for the system.

Reconfiguring care teams has made financial sense for UCHealth, too. Although the clinic where Lin and Peterson work has roughly twice as many medical assistants today as it had a year ago—plus a social worker and nurse manager—the configuration saves doctors so much time that they’re able to see more patients each day. The extra visits bring in enough money to cover the cost of adding more employees.

“The reason a lot of this happened is physician burnout was significant, especially in primary care,” says Dr. Carmen Lewis, the medical director of the Lowry clinic. The redesigned teams launched earlier this year aim to make doctors’ lives less stressful.

Patients across the UCHealth system don’t seem to mind the change. A few will ask to speak with their doctor in private, but others are more open with the medical assistant than with their doctor. “Sometimes, they don’t feel as judged,” Peterson says.

Lin says that since he’s started working with Peterson, his patients have been better able to keep their blood pressure and diabetes under control. “Patients will forget to tell me that they’re out of prescriptions,” he says—or he’ll be so busy tackling a more immediate problem that he’ll forget to ask.

With a medical assistant methodically asking all the opening questions, crucial details such as prescription renewals no longer slip through the cracks.

Rethinking medical school

Medical school leaders want to make sure the next generation of doctors has the skills and mind-set the jobs of the future will require—such as the ability to lead teams effectively, draw insights from data sets and guide patients through a system full of bewildering treatments, care settings and payment options.

Students traditionally spend the first two years of medical school learning science in classrooms and two years getting hands-on experience at clinical sites. That’s no longer enough, says Susan Skochelak, group vice president for medical education at the American Medical Association.

She says students need to understand “health system science”—everything from how health insurance works to how factors such as income and education affect health. “We had medical students who were graduating, not knowing the difference between Medicare and Medicaid,” she says.

So in 2013 the AMA began issuing grants to medical schools that wanted to do things differently. One program allowed Indiana University to put anonymous patient data into an electronic health record students can use to search for clues to a patient’s health—such as whether he is showing signs of opioid addiction. Another grant allowed Pennsylvania State University to create a new curriculum that requires medical students to work as patient navigators.

“Brand new medical students—they totally get the need for this,” says Robert Pendleton, a professor of internal medicine at the University of Utah and the university hospital system’s chief medical quality officer. At this year’s kickoff for an elective curriculum on data and performance measurement, he says, students packed the auditorium.

And all medical schools are trying to emphasize teamwork. At the University of Colorado medical school, the idea that doctors should treat non-doctors as partners—not subordinates—is impressed on students from Day One, says Harin Parikh, a second-year student.

The medical school shares a campus with education programs for six other health professions. Students hang out on the same quad, grab lunch in the same places, and even take some classes together. In a required first-year class, students from a mix of health fields are split into teams and are asked to plan a response to given scenarios. One day, a nursing student might lead the team; the next, a pharmacy student.

Parikh says the team-based approach makes sense to him. “From a provider perspective, it’s about checks and balances,” he says. When multiple people, with different kinds of expertise, come together around a patient, one may notice something the others don’t.

Reorienting medical schools, like reorienting hospital systems, will take time. Scheduling barriers can make it hard to get students from different health fields in one room, for instance. Some faculty members aren’t prepared to teach a new kind of curriculum. And when students leave school for their clinical training, they work in real-life settings that are all over the spectrum when it comes to teamwork.

“We’re working on an ideal,” says John Luk, assistant dean for interprofessional integration at the Dell Medical School at the University of Texas at Austin. “But the reality is, many of us have not been practicing at the ideal.”

Med School Grads Go to Work for Hedge Funds


More are starting biotech companies or joining consulting or financial firms instead of practicing—all while the U.S. suffers a shortage of doctors.

Matthew Alkaitis, a third-year student at Harvard Medical School, is calm, friendly, and a good listener—the kind of qualities you’d want in a doctor. But though he spends 14 hours a day studying for his board exams, the 29-year-old isn’t sure how long he’ll be wearing a white coat. In September, Alkaitis, who also has a Ph.D. in biomedical sciences, will be starting a two-year fellowship at McKinsey & Co., where he’ll be advising clients in the health-care field. “I really hope that my career involves a period of dedicated time taking care of patients,” he says. “But I also have this competing goal to one day start or help build out a company that really adds something new and interesting and innovative to the medical system.”

Like Alkaitis, more people are coming out of medical school and choosing not to practice medicine. Instead, they’re going into business—starting biotech and medical device companies, working at private equity firms, or doing consulting. In a 2016 survey of more than 17,000 med school grads by the Physicians Foundation and health-care recruitment firm Merritt Hawkins, 13.5 percent said they planned to seek a nonclinical job within three years. That’s up from 9.9 percent in 2012. A separate Merritt Hawkins survey asks final-year residents: “If you were to begin your education again, would you study medicine or would you select another field?” In 2015, 25 percent answered “another field,” up from 8 percent in 2006. Among the reasons they cited: a lack of free time, educational debt, and the hassle of dealing with insurance companies and other third-party payers.

The trend is worrying, as the U.S. already suffers a shortage of doctors, especially in rural areas. “If you have a large number of people out training to see patients and taking care of people in our communities, then all of a sudden deciding not to, that’s a concern,” says Atul Grover, executive vice president of the Association of American Medical Colleges. The AAMC projects a nationwide deficit of as many as 100,000 doctors by 2030.

“I think that we are at a crossroads,” says Dr. Kevin Campbell, a cardiologist in Raleigh, N.C. “I trained in the early ’90s, and back then you definitely were thought of as a sellout or a second-class citizen if you weren’t going into clinical medicine.”

Medical students have more options nowadays. Medical and business schools are teaming up to offer joint degrees. There were 148 students enrolled in M.D.-MBA programs in 2016, up from 61 in 2003, according to the AAMC. At Harvard Medical School, in a class of about 160 students, about 14 will pursue the joint degree, and an additional 25 or 30 will do master’s degrees in other areas, such as law and public policy. “We have some students who want to go back to the Midwest and practice in a community setting,” says Dr. Anthony D’Amico, a professor of radiation oncology at Harvard Medical School and an advisory dean. And then there are those “who want to implement skill sets they’ve been blessed with and apply them on a broader scale.”

Dr. Rodney Altman of San Francisco says the time he spends treating patients in the emergency room informs his work as a managing director at Spindletop Capital, a private equity firm that invests in health-care companies. “I really wanted to practice health care on a macro level,” says Altman. “For me the one-on-one interaction with patients, while important and rewarding, wouldn’t have been as rewarding as being able to impact a larger number of patients.”

Altman says his mentors and colleagues had mixed feelings when, after a decade of practicing full time, he decided to dial back his hours in the emergency room. “Most people were supportive, a lot were envious, and some appropriately cautioned me about the risks I would be taking,” he says. “Out in the business world, you’re subject to the whims of the capital markets and to a lot more that is out of one’s control. I think medicine is quite safe and secure in that way.”

Some consulting companies are also stepping up hiring of doctors. Steffi Langner, a spokeswoman for McKinsey, says her firm is actively recruiting doctors because the analytical skills necessary to be an M.D. are similar to the problem-solving skills a consultant needs.

Dr. Jon Bloom trained as an anesthesiologist and practiced for three months, then enrolled at Massachusetts Institute of Technology’s Sloan School of Management. He says he was inspired by other doctors he knew who were inventors and entrepreneurs. One reason more are choosing that path is that investors are willing to fund them. Figures compiled by the National Venture Capital Association show that investment in medical-related startups climbed from $9.4 billion in 2007 to $11.9 billion in 2016.

Bloom is co-founder and chief executive officer of Podimetrics, a startup in Somerville, Mass., that has developed a mat device that predicts and prevents diabetic foot ulcers. He says that even though his invention is now on the market after receiving approval from the U.S. Food and Drug Administration in 2015, he’s still living the startup life. “I definitely don’t make nearly as much as what a doctor makes. That wasn’t really important to me,” he says. “My friends who graduated residency many years ago, they have multiple cars, fabulous houses. They did OK. I still occasionally eat ramen noodles,” he chuckles.

BOTTOM LINE – A U.S. deficit of doctors may worsen as a growing minority of medical school grads are choosing other professions.

Kaiser releases plans for California medical school: 6 things to know

Oakland, Calif.-based Kaiser Permanente revealed plans to replace a former office building and parking lot in Pasadena, Calif., with an 80,000-square-foot medical school, according to the Pasadena Star-News.

Here are six things to know about the proposed medical school.

1. The four-story building will reportedly feature floor-to-ceiling windows on the first floor, an open rooftop and state-of-the-art classrooms.

2. The health system previously disclosed plans to build the medical school in 2015. Officials chose to build in Pasadena because of the location’s proximity to other Kaiser facilities and access to affordable housing, public transit and major highways, according to the report.

3. Two additional Kaiser-owned research and administrative facilities will remain on site.

4. The health system expects to open the medical school in 2019 with an inaugural class of 48 students. By 2022, officials expect a full enrollment of 192 students, according to the report.

5. The School of Medicine will not house clinics or treat patients within the facility. Instead, students will travel to the health system’s 14 other locations in southern California to gain hands-on experience.

6. Officials said they hope to break ground on the facility by the end of the year. The proposal must first undergo environmental and design reviews by the city’s design commission, according to the report.