Florida’s COVID Response Includes Missing Deadlines and Data


 Since the beginning of the coronavirus pandemic, Florida has blocked, obscured, delayed, and at times hidden the COVID-19 data used in making big decisions such as reopening schools and businesses.

And with scientists warning Thanksgiving gatherings could cause an explosion of infections, the shortcomings in the state’s viral reporting have yet to be fixed.

While the state has put out an enormous amount of information, some of its actions have raised concerns among researchers that state officials are being less than transparent.

It started even before the pandemic became a daily concern for millions of residents. Nearly 175 patients tested positive for the disease in January and February, evidence the Florida Department of Health collected but never acknowledged or explained. The state fired its nationally praised chief data manager, she says in a whistleblower lawsuit, after she refused to manipulate data to support premature reopening. The state said she was fired for not following orders.

The health department used to publish coronavirus statistics twice a day. It later switched to once a day, consistently meeting an 11 a.m. deadline for releasing new information that scientists, the media and the public could use to follow the pandemic’s latest twists.

But in the past month the department has routinely and inexplicably missed its own deadline, sometimes by as much as six hours. On one day in October, it published no update at all.

News outlets were forced to sue the state before it would publish information identifying the number of infections and deaths at individual nursing homes.

Throughout it all, the state has kept up with the rapidly spreading virus by publishing daily updates of the numbers of cases, deaths and hospitalizations.

“Florida makes a lot of data available that is a lot of use in tracking the pandemic,” University of South Florida epidemiologist Jason Salemi said. “They’re one of the only states, if not the only state, that releases daily case line data (showing age, sex and county for each infected person).”

Dr. Terry Adirim, chairwoman of Florida Atlantic University’s Department of Integrated Biomedical Science, agreed, to a point.

“The good side is they do have daily spreadsheets,” Adirim said. “However, it’s the data that they want to put out.”

The state leaves out crucial information that could help the public better understand who the virus is hurting and where it is spreading, Adirim said.

The department, under state Surgeon General Dr. Scott Rivkees, oversees the local health agencies covering Florida’s 67 counties, such as the one in Palm Beach County headed by Dr. Alina Alonso.

Rivkees was appointed in April 2019. He reports to Gov. Ron DeSantis, a Republican who has supported President Donald Trump’s approach to fighting the coronavirus and pressured local officials to reopen schools and businesses despite a series of spikes indicating rapid spread of the disease.

At several points, the DeSantis administration muzzled local health directors, such as when it told them not to advise school boards on reopening campuses.

DOH Knew Virus Here Since January

The health department’s own coronavirus reports indicated that the pathogen had been infecting Floridians since January, yet health officials never informed the public and did not acknowledge the early cases even after The Palm Beach Post first reported them in May.

In fact, the night before The Post broke the story, the department inexplicably removed from public view the state’s dataset that provided the evidence. Mixed among listings of thousands of cases was evidence that up to 171 people ages 4 to 91 had tested positive for COVID-19 in the months before officials announced in March the disease’s presence in the state.

Were the media reports on the meaning of those 171 cases in error? The state has never said.

No Testing Stats Initially

When positive tests were finally acknowledged in March, all tests had to be confirmed by federal health officials. But Florida health officials refused even to say how many people in each county had been tested.

State health officials and DeSantis claimed they had to withhold the information to protect patient privacy, but they provided no evidence that stating the number of people tested would reveal personal information.

At the same time, the director of the Hillsborough County branch of the state health department publicly revealed that information to Hillsborough County commissioners.

And during March the state published the ages and genders of those confirmed to be carrying the disease, along with the counties where they claimed residence, on a website that wasn’t promoted to the public.

Firing Coronavirus Data Chief

In May, with the media asking about data that revealed the earlier onset of the disease, internal emails show that a department manager ordered the state’s coronavirus data chief to yank the information off the web, even though it had been online for months.

A health department tech supervisor told data manager Rebekah Jones on May 5 to take down the dataset. Jones replied in an email that it was the “wrong call,” but complied, only to be ordered an hour later to put it back.

That day, she emailed reporters and researchers following a listserv she created, saying she had been removed from handling coronavirus data because she refused to manipulate datasets to justify DeSantis’ push to begin reopening businesses and public places.

Two weeks later, the health department fired Jones, who in March had created and maintained Florida’s one-stop coronavirus dashboard, which had been viewed by millions of people, and had been praised nationally, including by White House Coronavirus Task Force Coordinator Deborah Birx.

The dashboard allows viewers to explore the total number of coronavirus cases, deaths, tests and other information statewide and by county and across age groups and genders.

DeSantis claimed on May 21 that Jones wanted to upload bad coronavirus data to the state’s website. In a further attempt to discredit her, he brought up a stalking complaint an ex-lover had filed against her, stemming from a blog post she wrote, which led to two misdemeanor charges.

Using her technical know-how, Jones launched a competing COVID-19 dashboard website, FloridaCOVIDAction.com, in early June. After national media covered Jones’ firing and website launch, people donated more than $200,000 to her through GoFundMe to help pay her bills and maintain the website.

People view her site more than 1 million times a day, she said. The website features the same type of data the state’s dashboard displays, but also includes information not present on the state’s site such as a listing of testing sites and their contact information.

Jones also helped launch TheCOVIDMonitor.com to collect reports of infections in schools across the country.

Jones filed a whistleblower complaint against the state in July, accusing managers of retaliating against her for refusing to change the data to make the coronavirus situation look better.

“The Florida Department of Health needs a data auditor not affiliated with the governor’s office because they cannot be trusted,” Jones said Friday.

Florida Hides Death Details

When coronavirus kills someone, their county’s medical examiner’s office logs their name, age, ethnicity and other information, and sends it to the Florida Department of Law Enforcement.

During March and April, FDLE refused requests to release that information to the public, even though medical examiners in Florida have always made it public under state law. Many county medical examiners, acknowledging the role that public information can play in combating a pandemic, released the information without dispute.

But it took legal pressure from news outlets, including The Post, before FDLE agreed to release the records it collected from local medical examiners.

When FDLE finally published the document on May 6, it blacked out or excluded crucial information such as each victim’s name or cause of death.

But FDLE’s attempt to obscure some of that information failed when, upon closer examination, the seemingly redacted details could in fact be read by common computer software.

Outlets such as Gannett, which owns The Post, and The New York Times, extracted the data invisible to the naked eye and reported in detail what the state redacted, such as the details on how each patient died.

Reluctantly Revealing Elder Care Deaths, Hospitalizations

It took a lawsuit against the state filed by the Miami Herald, joined by The Post and other news outlets, before the health department began publishing the names of long-term care facilities with the numbers of coronavirus cases and deaths.

The publication provided the only official source for family members to find out how many people had died of COVID-19 at the long-term care facility housing their loved ones.

While the state agreed to publish the information weekly, it has failed to publish several times and as of Nov. 24 had not updated the information since Nov. 6.

It took more pressure from Florida news outlets to pry from the state government the number of beds in each hospital occupied by coronavirus patients, a figure DeSantis himself has called a key indicator of the disease’s spread.

That was one issue where USF’s Salemi publicly criticized Florida.

“They were one of the last three states to release that information,” he said. “That to me is a problem because it is a key indicator.”

Confusion Over Positivity Rate

One metric DeSantis touted to justify his decision in May to begin reopening Florida’s economy was the so-called positivity rate, which is the share of tests reported each day with positive results.

But Florida’s daily figures contrasted sharply with calculations made by Johns Hopkins University, prompting a South Florida Sun-Sentinel examination that showed Florida’s methodology underestimated the positivity rate.

The state counts people who have tested positive only once, but counts every negative test a person receives until they test positive, so that there are many more negative tests for every positive one.

Johns Hopkins University, on the other hand, calculated Florida’s positivity rate by comparing the number of people testing positive with the total number of people who got tested for the first time.

By Johns Hopkins’ measure, between 10 and 11 percent of Florida’s tests in October came up positive, compared to the state’s reported rate of between 4 and 5 percent.
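The gap between the two methods is easy to reproduce with a small worked example. Here is a minimal Python sketch using invented test histories (Florida’s line-level testing data is not public, so every number below is purely illustrative): the Florida-style calculation counts each positive person once but keeps all negative tests in the denominator, while the Johns Hopkins-style calculation counts each tested person once.

```python
# Purely illustrative test histories; True = positive result.
# Each inner list is one person's tests in chronological order.
people = [
    [False, False, True],   # two negatives, then a positive
    [False, False, False],  # three negatives, never positive
    [True],                 # positive on the first test
    [False],                # a single negative test
]

positives = sum(any(tests) for tests in people)            # people who ever tested positive
negatives = sum(not r for tests in people for r in tests)  # every negative test, repeats included

# Florida-style: positive people counted once, all negative tests counted.
florida_rate = positives / (positives + negatives)         # 2 / 8 = 25%

# Johns Hopkins-style: people testing positive over all people tested.
jhu_rate = positives / len(people)                         # 2 / 4 = 50%

print(f"Florida-style positivity: {florida_rate:.0%}")
print(f"Johns Hopkins positivity: {jhu_rate:.0%}")
```

Because repeat negative tests inflate only the denominator, the Florida-style figure comes out lower whenever people are tested more than once, consistent with the 4-to-5 percent versus 10-to-11 percent gap described above.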

Health experts such as those at the World Health Organization have said a state’s positivity rate should stay below 5 percent for 14 days straight before the state considers the virus under control and moves forward with reopening public places and businesses. It’s also an important measure for travelers, who may be required to quarantine if they enter a state with a high positivity rate.

Withholding Detail on Race, Ethnicity

The Post reported in June that tests taken by Black and Hispanic people, and tests conducted in majority-minority ZIP codes, were twice as likely to come back positive as tests taken by white people and tests in majority-white ZIP codes.

That was based on a Post analysis of internal state data the health department will not share with the public.

The state publishes bar charts showing general racial breakdowns but does not report race or ethnicity for each infected person.

If it wanted to, Florida’s health department could publish detailed data that would shed light on the infection rates among each race and ethnicity or each age group, as well as which neighborhoods are seeing high rates of contagion.

Researchers have been trying to obtain this data but “the state won’t release the data without (making us) undergo an arduous data use agreement application process with no guarantee of release of the data,” Adirim said. Researchers must read and sign a 26-page, nearly 5,700-word agreement before getting a chance at seeing the raw data.

While Florida publishes the ages, genders and counties of residence for each infected person, “there’s no identification for race or ethnicity, no ZIP code or city of the residence of the patient,” Adirim said. “No line item count of negative test data so it’s hard to do your own calculation of test positivity.”

While Florida doesn’t explain its reasoning, one fear of releasing such information is the risk of identifying patients, particularly in tiny, non-diverse counties.

Confusion Over Lab Results

Florida’s daily report shows how many positive results come from each laboratory statewide. Except when it doesn’t.

The report has shown for months that 100 percent of COVID-19 tests conducted by some labs have come back positive despite those labs saying that shouldn’t be the case.

While the department reported in July that all 410 results from a Lee County lab were positive, a lab spokesman told The Post the lab had conducted roughly 30,000 tests. Other labs expressed the same confusion when informed of the state’s reporting.

The state health department said it would work with labs to fix the error. But even as recently as Tuesday, the state’s daily report showed positive result rates at or just under 100 percent from some labs, encompassing hundreds of tests.

Mistakenly Revealing School Infections

As DeSantis pushed in August for reopening schools and universities for students to attend in-person classes, Florida’s health department published a report showing hundreds of infections could be traced back to schools, before pulling that report from public view.

The health department claimed it published that data by mistake, the Miami Herald reported.

The report showed that COVID-19 had infected nearly 900 students and staffers.

The state resumed school infection reporting in September.

A similar publication of cases at day-care centers appeared online briefly in August only to come down permanently.

Updates Delayed

After shifting in late April to updating the public just once a day at 11 a.m. instead of twice daily, the state met that deadline on most days, and pandemic followers could rely on the predictability, until the schedule began to falter in October.

On Oct. 10, the state published no data at all, not informing the public of a problem until 5 p.m.

The state blamed a private lab for the failure but the next day retracted its statement after the private lab disputed the state’s explanation. No further explanation has been offered.

On Oct. 21, the report came out six hours late.

Since Nov. 3, the 11 a.m. deadline has never been met. Now, late afternoon releases have become the norm.

“They have gotten more sloppy and they have really dragged their feet,” Adirim, the FAU scientist, said.

No spokesperson for the health department has answered questions from The Post to explain the lengthy delays. Alberto Moscoso, the spokesman throughout the pandemic, departed without explanation Nov. 6.

The state’s tardiness can trip up researchers trying to track the pandemic in Florida, Adirim said, because if one misses a late-day update, the department could overwrite it with another update the next morning, eliminating critical information and damaging scientists’ analysis.

Hired Sports Blogger to Analyze Data

As if to show disregard for concerns raised by scientists, the DeSantis administration brought in a new data analyst who bragged online that he is no expert and doesn’t need to be.

Kyle Lamb, an Uber driver and sports blogger, sees his lack of experience as a plus.

“Fact is, I’m not an ‘expert’,” Lamb wrote on a website for a subscribers-only podcast he hosts about the coronavirus. “I also don’t need to be. Experts don’t have all the answers, and we’ve learned that the hard way throughout the entire duration of the global pandemic.”

Much of his coronavirus writing can be found on Twitter, where he has said masks and mandatory quarantines don’t stop the virus’ spread, and that hydroxychloroquine, a drug touted by President Donald Trump but rejected by medical researchers, treats it successfully.

While DeSantis says lockdowns aren’t effective in stopping the spread and refuses to enact a statewide mask mandate, scientists point out that quarantines and masks are extremely effective.

The U.S. Food and Drug Administration has said hydroxychloroquine is unlikely to help and poses greater risk to patients than any potential benefits.

Coronavirus researchers have called Lamb’s views “laughable,” and fellow sports bloggers have said he tends to act like he knows much about a subject in which he knows little, the Miami Herald reported.

DeSantis has yet to explain how and why Lamb was hired, nor has his office released Lamb’s application for the $40,000-a-year job. “We generally do not comment on such entry level hirings,” DeSantis spokesman Fred Piccolo said Tuesday by email.

It could be worse.

Texas health department workers have to manually enter data they read from paper faxes into the state’s coronavirus tracking system, The Texas Tribune has reported. And unlike Florida, Texas doesn’t require local health officials to report viral data to the state in a uniform way that would make it easier and faster to process and report.

It could be better.

In Wisconsin, health officials report the number of cases and deaths down to the neighborhood level. They also plainly report racial and ethnic disparities, which show the disease hits Hispanic residents hardest.

Still, Salemi worries that Florida’s lack of answers can undermine residents’ faith.

“My whole thing is the communication, the transparency,” Salemi said. “Just let us know what’s going on. That can stop people from assuming the worst. Even if you make a big error people are a lot more forgiving, whereas if the only time you’re communicating is when bad things happen … people start to wonder.”

Missouri’s COVID-19 data reports send ‘dangerous message to the community,’ say health systems


A group of health system leaders in Missouri challenged state-reported hospital bed data, saying it could lead to a misunderstanding about hospital capacity, according to a Nov. 19 report in the St. Louis Business Journal.

A consortium of health systems, including St. Louis-based BJC HealthCare, Mercy, SSM Health and St. Luke’s Hospital, released urgent reports warning that hospital and ICU beds are nearing capacity while state data reports show a much different story.

The state reports, based on data from TeleTracking and the CDC-managed National Healthcare Safety Network, show inpatient hospital bed capacity at 35 percent and remaining ICU bed capacity at 29 percent on Nov. 19. However, the consortium reported hospitals are fuller, at 84 percent capacity as of Nov. 18, and ICUs at 90 percent capacity based on staffed bed availability. The consortium says it is using staffed bed data while the state’s numbers are based on licensed bed counts; the state contends it does take staffing into account, according to the report.
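Much of that discrepancy is a denominator choice. A toy calculation (all numbers invented for illustration) shows how the same count of occupied beds can read as ample spare capacity against licensed beds but near-saturation against staffed beds:

```python
# Invented figures for one hypothetical hospital system.
occupied_beds = 840
licensed_beds = 1300  # beds the system is licensed to operate
staffed_beds = 1000   # beds it can actually staff right now

print(f"Occupancy vs licensed beds: {occupied_beds / licensed_beds:.0%}")  # 65%
print(f"Occupancy vs staffed beds:  {occupied_beds / staffed_beds:.0%}")   # 84%
```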

Stephanie Zoller Mueller, a spokesperson for the consortium, said the discrepancy between the state’s data and consortium’s data could create a “gross misunderstanding on the part of some and can be a dangerous message to the community.”

Convalescent Plasma: The Unanswered Questions

https://www.medpagetoday.com/infectiousdisease/covid19/88264?xid=fb_o&trw=no&fbclid=IwAR0F6xiRAQ7ngBz4pNozJ2VqWm0-UJqGdlQojfOeyXbPJjbAeYtGL8jbAiw

“The data don’t show anything useful”

Problems with the government’s rationale for authorizing use of convalescent plasma in COVID-19 patients go far beyond the dustup over the purported 35% survival benefit cited by top officials on Sunday, numerous researchers say.

That figure quickly came under fire, leading to an apology from Commissioner Stephen Hahn, MD — but that’s not the only criticism leveled at the FDA’s analysis of the available data.

Much of that data came from the Mayo Clinic and FDA expanded access program (EAP), at this point published only as a preprint manuscript. Although a large number of patients were included, the study was observational, with no untreated control group. That makes the findings merely hypothesis-generating; they can’t support any firm conclusions.

That’s fine for issuing an emergency use authorization (EUA), but not so much for making claims about survival benefit, independent researchers said.

“It’s not even a question of overstating,” Adam Gaffney, MD, MPH, a critical care doctor and health policy researcher at Harvard Medical School, told MedPage Today. “You can’t state much at all when you don’t have a randomized controlled trial.”

“People have made a big deal of Hahn referring to relative versus absolute risk reduction, but I think that’s less of a big deal,” Gaffney said. “The biggest problem is that the data they are citing … is not randomized. That’s the source of all the problems.”

Hahn took heat for saying that a “35% improvement in survival is a pretty substantial clinical benefit,” further explaining that of “100 people who are sick with COVID-19, 35 would have been saved because of the administration of plasma.”

Critics rapidly took to Twitter, stating that the interpretation was incorrect. Hahn was referring to relative risk reduction, not absolute risk reduction. Thus, calculating the number of lives saved — which isn’t something experts recommend doing based on observational data in the first place — would have translated to somewhere more in the ballpark of 5 out of 100.
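The arithmetic behind that ballpark figure is simple to sketch. The baseline mortality rate below is a hypothetical input chosen only to show how relative and absolute risk reduction diverge; it is not taken from the EAP preprint:

```python
# Hypothetical inputs; only the "35%" figure comes from the public debate.
baseline_mortality = 0.14  # assumed deaths per patient in the comparison group
relative_reduction = 0.35  # the relative risk reduction as quoted

treated_mortality = baseline_mortality * (1 - relative_reduction)  # 0.091
absolute_reduction = baseline_mortality - treated_mortality        # ~0.049

# Roughly 5 lives per 100 patients, not 35.
print(f"Deaths averted per 100 patients: {absolute_reduction * 100:.0f}")
# A literal "35 of 100 saved" would require baseline mortality near 100%.
```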

Moreover, the “risk reduction” came from a comparison of patients treated with high-titer plasma versus those receiving lower-titer preparations. The study offered no basis for concluding how many patients may have been “saved” relative to standard care.

And the 35% reduction in that analysis was for 7-day mortality; the relative reduction at 30 days was only 23%.

Hahn’s recital of the 35% figure “was just PART of the error,” tweeted Vinay Prasad, MD, MPH, of the University of California San Francisco. “The entire comparison is flawed. It is not a suitable control. The data don’t show anything useful.”

“The much broader problem here is the lack of commitment to performing large, national randomized controlled trials,” Gaffney said. “We could have done it for convalescent plasma. Instead, we did the EAP. I understand why people wanted it, but now we don’t know [if convalescent plasma works]. We have a question mark instead of a period.”

Undermining Trust in FDA?

Critics have charged that serious mistakes like Hahn’s misstatement could undermine FDA’s credibility, especially as it faces challenging decisions about potentially approving a vaccine this fall.

“This is playing out in the context of a hyper-politicized moment,” Gaffney said. “It behooves everyone to be extremely cautious in speaking about these things to avoid the appearance of politicization.”

On CBS This Morning on Tuesday, Hahn addressed concerns about politicization by offering reassurance to the “American people that this decision was made based upon sound science and data.”

In response to questions about the timing of the EUA announcement — it came just a day after President Donald Trump tweeted allegations that the “deep state” was holding back access to COVID-19 treatments with Hahn’s Twitter handle cited, and a day before the Republican National Convention got underway — Hahn said the agency had been working on the application for 3 or 4 weeks and was waiting on additional validation data, which were received at the end of last week and over the weekend.

“We’re going to continue to get data and as we’ve done with any other authorization, we will update that decision as new data come,” Hahn said on the news program. His agency initially issued an EUA for hydroxychloroquine, for instance, but later revoked it when the negative randomized trial data became available.

Lack of Access to FDA’s Data Review

Whether the public will ever see the full convalescent plasma data underlying the EUA is another matter. The “Clinical Memorandum” issued as the evidence behind the FDA’s decision glossed over the statistical analysis conducted by the agency; in particular, it made no mention of the 35% relative reduction in deaths.

Another problem is that the source of the 35% figure isn’t fully clear. Although the EAP preprint manuscript is the most obvious source, Gaffney noted that HHS Secretary Alex Azar said it referred to a subgroup of patients under age 80 who were not on a ventilator. That is not found in the publicly available data. He also pointed to a tweet by FDA spokesperson Emily Miller that contains an agency slide showing a 37% reduction in mortality for non-intubated patients age 80 or under treated within 72 hours who got high-titer convalescent plasma, compared with low-titer product. Neither of those figures is reflected in the EAP manuscript.

The FDA did not return a request by MedPage Today for the full summary of data reviewed by FDA and any independent statistical analysis done by the agency.

Shmuel Shoham, MD, of Johns Hopkins University in Baltimore, said during a press briefing organized by the Infectious Diseases Society of America that “enormous amounts of data have been generated” from the EAP, in which more than 70,000 patients have been treated.

“Some data have been reported in articles and at meetings, but that’s only part of what the FDA — this is their program — has access to,” he said. “The stuff in the public domain is only a fraction of the data they have collected.”

Shoham is on the scientific advisory board of the EAP and is involved in two convalescent plasma clinical trials at Johns Hopkins.

Gaffney said Mayo researchers and FDA reviewers have noted that physicians were blinded to the dose of antibody given in plasma infusions, which he described as a “pseudo-randomization effect. We could use that to make more causal inferences about the effectiveness of antibody titers.”

However, he said there were some significant differences between those who received high-titer versus low-titer antibody, including differences in P-to-F ratio (a measure of inhaled oxygen to blood oxygen) and in those with five or more severe risk factors, suggesting the low-titer group was sicker to begin with than the high-titer group.

Also, patients in the EAP received a variety of other treatments: about half got steroids and 40% were given remdesivir.

“This is why we do randomized controlled trials,” Gaffney said. “Without them it’s very difficult to ensure that the effect you see is the result of the drug, and not the result of patient characteristics.”

Is an Answer Forthcoming?

Several randomized controlled trials of convalescent plasma are underway in the U.S., but the big concern is that wider access to convalescent plasma will limit enrollment. Will clinicians recommend that their patients enroll in a trial in which they might receive placebo? Will patients agree?

For the Hopkins studies, the prevention trial has enrolled 25 people out of a goal of 500, and the outpatient trial has enrolled 50 people toward its 600-patient goal.

Liise-anne Pirofski, MD, of Montefiore Medical Center in New York, started a study at the end of April, looking to enroll 300 people. She said the team enrolled the first 150 people quickly, but “then the pandemic began to wane in New York.” With subsequent funding from the NIH, the trial has managed to enroll 190 patients, and has now expanded to four additional sites: New York University, Yale, the University of Miami, and the University of Texas Houston.

Clifton Callaway, MD, PhD, of the University of Pittsburgh Medical Center and lead investigator on the C3PO trial looking at outpatient convalescent plasma, said he hopes the EUA doesn’t discourage participation.

“To the contrary, I believe it should reassure persons considering participation that the FDA feels that convalescent plasma is safe and potentially useful and that the FDA specifically comments: ‘Current data suggest the largest clinical benefit is associated with high-titer units of CCP administered early in the course of disease.’ Giving high-titer convalescent plasma earlier (before you are sick enough to be in the hospital) is exactly what C3PO is testing.”

In addition to determining whether earlier or prophylactic treatment works, Shoham said other unanswered questions include identifying whether other components in plasma are useful therapies and whether low-titer plasma can work at all.

“What everyone agrees on is that the gaps in knowledge that exist can best be addressed by high-quality randomized controlled trials,” he said.

Pirofski said the science and data should be the focus, “rather than the decision and what drove the decision…. I don’t think anyone knows what drove that decision other than the people in that room. Hopefully they know.”


Covid-19 Data in the US Is an ‘Information Catastrophe’

https://www.wired.com/story/covid-19-data-in-the-us-is-an-information-catastrophe/#intcid=recommendations_wired-bottom-recirc-personalized_31e95638-88d6-439c-85a2-db8f6235da26_text2vec1-mab


The order to reroute CDC hospitalization figures raised accuracy concerns. But that’s just one of the problems with how the country collects health data.

TWO WEEKS AGO, the Department of Health and Human Services stripped the Centers for Disease Control and Prevention of control of national data on Covid-19 infections in hospitalized patients. Instead of sending the data to the CDC’s public National Healthcare Safety Network (NHSN), the department ordered hospitals to send it to a new data system, run for the agency by a little-known firm in Tennessee.

The change took effect immediately. First, the hospitalization data collected up until July 13 vanished from the CDC’s site. One day later, it was republished—but topped by a note that the NHSN Covid-19 dashboard would no longer be updated.

Fury over the move was immediate. All the major organizations that represent US public health professionals objected vociferously. A quickly written protest letter addressed to Vice President Mike Pence, HHS secretary Alex Azar, and Deborah Birx, the coordinator of the White House’s Coronavirus Task Force, garnered signatures from more than 100 health associations and research groups. The reactions made visible the groups’ concerns that data could be lost or duplicated, and underlined their continual worry that the CDC is being undercut and sidelined. But it had no other effect. The new HHS portal, called HHS Protect, is up and running.

Behind the crisis lies a difficult reality: Covid-19 data in the US—in fact, almost all public health data—is chaotic: not one pipe, but a tangle. If the nation had a single, seamless system for collecting, storing, and analyzing health data, HHS and the Coronavirus Task Force would have had a much harder time prying the CDC’s Covid-19 data loose. Not having a comprehensive system made the HHS move possible, and however well or badly the department handles the data it will now receive, the lack of a comprehensive data system is harming the US coronavirus response.

“Every health system, every public health department, every jurisdiction really has their own ways of going about things,” says Caitlin Rivers, a senior scholar at the Johns Hopkins Center for Health Security. “It’s very difficult to get an accurate and timely and geographically resolved picture of what’s happening in the US, because there’s such a jumble of data.”

Data systems are wonky objects, so it may help to step back and explain a little history. First, there’s a reason why hospitalization data is important: Knowing whether the demand for beds is rising or falling can help illuminate how hard-hit any area is, and whether reopening in that region is safe.

Second, what the NHSN does is important too. It’s a 15-year-old database, organized in 2005 out of several streams of information that were already flowing to the CDC, which receives data from hospitals and other health care facilities about anything that affects the occurrence of infections once someone is admitted. That includes rates of pneumonia from use of ventilators, infections after surgery, and urinary tract infections from catheters, for instance—but also statistics about usage of antibiotics, adherence to hand hygiene, complications from dialysis, occurrence of the ravaging intestinal infection C. difficile, and rates of health care workers getting flu shots. Broadly, it assembles a portrait of the safety of hospitals, nursing homes, and chronic care institutions in the US, and it shares that data with researchers and with other statistical dashboards published by other HHS agencies such as the Center for Medicare and Medicaid Services.

Because the NHSN collects only institutional data, while Covid-19 infections occur both inside institutions such as nursing homes and hospitals and in the outside world, HHS officials claimed the database was a bad fit for the coronavirus pandemic. But people who have worked with it argue that since the network had already devised channels for receiving all that data from health care systems, it ought to continue to do so, especially since that data isn’t easy to abstract.

“If you are lucky enough to work in a large health care system that has a sophisticated electronic medical record, then possibly you can push one button and have all the data flow up to NHSN,” says Angela Vassallo, an epidemiologist who formerly worked at HHS and is now chief clinical adviser to the infection-prevention firm Covid Smart. “But that’s a rare experience. Most hospitals have an infection preventionist, usually an entire team, responsible for transferring that data by hand.”

There lies the core problem. Despite big efforts back during the Obama administration to funnel all US health care data into one large-bore pipeline, what exists now resembles what you’d find behind the walls of an old house: pipes going everywhere, patched at improbable angles, some of them leaky, and some of them dead ends. To take some examples from the coronavirus response: Covid-19 hospital admissions were measured by the NHSN (before HHS intervened), but cases coming to emergency departments were reported in a different database, and test results were reported first to local or state health departments, and then sent up to the CDC.

Covid-19 data in particular has been so messy that volunteer efforts have sprung up to fix it. These include the COVID Tracking Project—compiled from multiple sources and currently the most comprehensive set of statistics, used by media organizations and apparently by the White House—and Covid Exit Strategy, which uses data from the COVID Tracking Project and the CDC.

Last week, the American Public Health Association, the Johns Hopkins Center, and Resolve to Save Lives, a nonprofit led by former CDC director Tom Frieden, released a comprehensive report on Covid-19 data collection. Pulling no punches, they called the current situation an “information catastrophe.”

The US, they found, does not have national-, state-, county-, or city-level standards for Covid-19 data. Every state maintains some form of coronavirus dashboard (and some have several), but every dashboard is different; no two states present the same data categories, nor visualize them the same way. The data presented by states is “inconsistent, incomplete, and inaccessible,” the group found: Out of 15 key pieces of data that each state should be presenting—things such as new confirmed and probable cases, new tests performed, and percentage of tests that are positive—only 38 percent of the indicators are reported in some way, with limitations, and 60 percent are not reported at all.

“This is not the fault of the states—there was no federal leadership,” Frieden emphasized in an interview with WIRED. “And this is legitimately difficult. But it’s not impossible. It just requires commitment.”

But the problem of incomplete, messy data is older and deeper than this pandemic. Four scholars from the health-policy think tank the Commonwealth Fund called out the broader problem just last week in an essay in The New England Journal of Medicine, naming health data as one of four interlocking health care crises exposed by Covid-19. (The others were reliance on employer-provided health care, financial losses in rural and primary-care practices, and the effect of the pandemic on racial and ethnic minorities.)

“There is no national public health information system—electronic or otherwise—that enables authorities to identify regional variation in the demand for, and supply of, resources critical to managing Covid-19,” they wrote. The fix they recommended: a national public health information system that would record diagnoses in real time, monitor the materials hospitals need, and link hospitals and outpatient care, state and local health departments, and laboratories and manufacturers to maintain real-time reporting on disease occurrence, preventive measures, and equipment production.

They are not the first to say this is needed. In February 2019, the Council of State and Territorial Epidemiologists launched a campaign to get Congress to appropriate $1 billion in new federal funding over 10 years specifically to improve data flows. “The nation’s public health data systems are antiquated, rely on obsolete surveillance methods, and are in dire need of security upgrades,” the group wrote in its launch statement. “Sluggish, manual processes—paper records, spreadsheets, faxes, and phone calls—still in widespread use, have consequences, most notably delayed detection and response to public health threats.”

Defenders of the HHS decision to switch data away from the CDC say that improving problems like that is what the department was aiming for. (“The CDC’s old hospital data-gathering operation once worked well monitoring hospital information across the country, but it’s an inadequate system today,” HHS assistant secretary for public affairs Michael Caputo told CNN.) If that’s an accurate claim, during a global pandemic is a challenging time to do it.

“We were opposed to this, because trying to do this in the middle of a disaster is not the time,” says Georges Benjamin, a physician and executive director of the American Public Health Association, which was a signatory to the letter protesting moving data from the NHSN. “It was just clearly done without a lot of foresight. I don’t think they understand the way data moves into and through the system.”

The past week has shown how correct that concern was. Immediately after the switch, according to CNBC, states were blacked out from receiving data on their own hospitals, because the hospitals were not able to manage the changeover from the CDC to the HHS system. On Tuesday, Ryan Panchadsaram, cofounder of Covid Exit Strategy and former deputy chief technology officer for the US, highlighted on Twitter that data on the HHS dashboard, advertised as updating daily, was five days old. And Tuesday night, the COVID Tracking Project staff warned in a long analysis: “Hospitalization data from states that was highly stable a few weeks ago is currently fragmented, and appears to be a significant undercount.”

When the Covid-19 crisis is over, as everyone hopes it will be someday, the US will still have to wrestle with the questions it raised. One of those will be how the richest country on the planet, with some of the best clinical care in the world, was content with a health information system that left it so uninformed about a disease affecting so many of its citizens. The answer could involve tearing the public-health data system down and building it again from scratch.

“This is a deeply entrenched problem, where there is no single person who has not done their job,” Rivers says. “Our systems are old. They were not updated. We haven’t invested in them. If you’re trying to imagine a system where everyone reports the same information in the same way and we can push a button and have all the information we might want, that will take a complete overhaul of what we have.”


Covid-19 data is a public good. The US government must start treating it like one.

https://www.technologyreview.com/2020/07/17/1005391/covid-coronavirus-hospitalizations-data-access-cdc/


The US has failed to prioritize a highly effective and economical intervention—providing quick and easy access to coronavirus data.

Earlier this week as a pandemic raged across the United States, residents were cut off from the only publicly available source of aggregated data on the nation’s intensive care and hospital bed capacity. When the Trump administration stripped the Centers for Disease Control and Prevention (CDC) of control over coronavirus data, it also took that information away from the public.

 

I run a nonpartisan project called covidexitstrategy.org, which tracks how well states are fighting this virus. Our team is made up of public health and crisis experts with previous experience in the Trump and Obama administrations. We grade states on such critical measures as disease spread, hospital load, and the robustness of their testing. 

 

Why does this work matter? In a crisis, data informs good decision-making. Along with businesses, federal, state, and local public health officials and other agencies rely on us to help them decide which interventions to deploy and when workplaces and public spaces can safely reopen. Almost a million people have used our dashboards, with thousands coming back more than 200 times each.

To create our dashboards, we rely on multiple sources. One is the National Healthcare Safety Network (NHSN), run by the CDC. Prior to July 14, hospitals reported the utilization and availability of intensive care and inpatient beds to the NHSN. This information, updated three times a week, was the only publicly available source of aggregated state-level hospital capacity data in the US.

With 31 states currently reporting increases in the number of hospitalized covid-19 patients, these utilization rates show how well their health systems will handle the surge of cases.

 

Having this information in real time is essential; the administration said the CDC’s system was insufficiently responsive and data collection needed to be streamlined. The US Department of Health and Human Services (HHS) directed hospitals (pdf) to report their data to a new system called HHS Protect.

Unfortunately, by redirecting hospitals to a new system, it left everyone else in the dark. On July 14, the CDC removed the most recent data from its website. As we made our nightly update, we found it was missing. After significant public pressure, the existing maps and data are back—but the agency has added a disclaimer that the data will not be updated going forward.

 

This is unacceptable. This critical indicator was being shared multiple times a week, and now updates have been halted. US residents need a federal commitment that this data will continue to be refreshed and shared.

The public is being told that a lot of effort is going into the new system. An HHS spokesman told CNBC that the new database will deliver “more powerful insights” on the coronavirus. But the switch has rightly been criticized because this new data source is not yet available to the public. Our concerns are amplified by the fact that responsibility for the data has shifted from a known entity in the CDC to a new, as-yet-unnamed team within HHS.

I was part of the team that helped fix Healthcare.gov after the failed launch in 2013. One thing I learned was that the people who make their careers in the federal government—and especially those working at the center of a crisis—are almost universally well intentioned. They seek to do the right thing for the public they serve.

 

In the same spirit, and to build trust with the American people, this is an opportunity for HHS to make the same data it’s sharing with federal and state agencies available to the public. The system that HHS is using helps inform the vital work of the White House Coronavirus Task Force. From leaked documents, we know that reports for the task force are painstakingly detailed. They include county-level maps, indicators on testing robustness, and specific recommendations. All of this information belongs in the public domain.

This is also an opportunity for HHS to make this data machine readable and thereby more accessible to data scientists and data journalists. The Open Government Data Act, signed into law by President Trump, treats data as a strategic asset and makes it open by default. This act builds upon the Open Data Executive Order, which recognized that the data sets collected by the government are paid for by taxpayers and must be made available to them. 

As a country, the United States has lagged behind in so many dimensions of response to this crisis, from the availability of PPE to testing to statewide mask orders. Its treatment of data has lagged as well. On March 7, as this crisis was unfolding, there was no national testing data. Alexis Madrigal, Jeff Hammerbacher, and a group of volunteers started the COVID Tracking Project to aggregate coronavirus information from all 50 state websites into a single Google spreadsheet. For two months, until the CDC began to share data through its own dashboard, this volunteer project was the sole national public source of information on cases and testing.

With more than 150 volunteers contributing to the effort, the COVID Tracking Project sets the bar for how to treat data as an asset. I serve on the advisory board and am awed by what this group has accomplished. With daily updates, an API, and multiple download formats, they’ve made their data extraordinarily useful. Where the CDC’s data is cited 30 times in Google Scholar and approximately 10,000 times in Google search results, the COVID Tracking Project data is cited 299 times in Google Scholar and roughly 2 million times in Google search results.
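That usability gap is concrete. The COVID Tracking Project’s v1 API serves plain JSON over HTTPS, so pulling a state’s daily time series takes a few lines. A minimal sketch using only the Python standard library follows; the endpoint matches the project’s public API documentation, though its continued availability is an assumption here:

```python
import json
import urllib.request

# COVID Tracking Project v1 API: daily time series for Florida.
URL = "https://api.covidtracking.com/v1/states/fl/daily.json"

with urllib.request.urlopen(URL) as resp:
    days = json.load(resp)

latest = days[0]  # records are returned newest-first
print(latest["date"], latest["positive"], latest["death"])
```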

 

Sharing reliable data is one of the most economical and effective interventions the United States has to confront this pandemic. With the Coronavirus Task Force daily briefings a thing of the past, it’s more necessary than ever for all covid-related data to be shared with the public. The effort required to defeat the pandemic is not just a federal response. It is a federal, state, local, and community response. Everyone needs to work from the same trusted source of facts about the situation on the ground.

Data is not a partisan affair or a bureaucratic preserve. It is a public trust—and a public resource.


Healthcare mergers and acquisitions require sensible data sharing strategies, and a solid analytics framework

https://www.healthcarefinancenews.com/news/healthcare-mergers-and-acquisitions-require-sensible-data-sharing-strategies-and-solid?mkt_tok=eyJpIjoiTmpJME5qVTNOVEU1TXpRdyIsInQiOiJDdUIxQ1NKdng1b0FkQ1wvQlwvNFBTc1JIbmVwYUZOeUhCZ3VlNlZzdmhNbkhBQlhnXC9JeTI4c2NDeE80REk0YWJ1Nk1jSzl4QjFDbjFMTkxKdmVCblY1RUlSYTIwUmlhSEJ6VXpkOUZZdytUWDhaV1poaEljcVh5ZFdEOUdVZlQzZyJ9

While it’s important for disparate EHRs to communicate with one another, organizations need a better handle on analytics and dashboards.

Mergers and acquisitions in healthcare have been going along at a pretty good clip for a number of years now. The volume of deals remains high, and with larger entities primed to scoop up some of their smaller, struggling peers, the trend seems poised to continue.

There’s an issue that consolidating organizations consistently run into, however: data sharing.

Specifically, many organizations that have initiated merger activity fail to consider that consolidation will necessitate integrating not only multiple electronic health records but other ancillary systems as well.

These organizations need to produce the analytics that are required to manage what’s essentially a new business, and that starts with the development of some sort of analytics blueprint early on in the merger activity.

As two or more organizations join into one, it’s important to have the analytics blueprint in place so leadership knows which dashboards will be needed for success.

“There used to be a trend where everyone was converted onto the same EHR platform,” said John Walton, solutions architect for IT consulting company CTG. “I guess the thought is that if everyone is converted, the problem will go away. Now … they end up in a situation where they can’t produce the kind of dashboards that are needed.”

THE FRAMEWORK

A key component of an effective analytics blueprint is a conceptual data model — basically a visual representation of what domains are needed for the dashboards.

“It sounds difficult to produce, but if it takes more than four to six weeks to produce something like that, you’re overthinking it,” said Walton. “But that’s the starting point. The key component is that analytics framework.”

Failure to have a framework in place can result in the newly merged entity losing out in terms of revenue and productivity. And once the problem becomes manifest, there’s often a lot of manual effort that goes into serving, for example, the financial dashboards that are so needed by CFOs. A lot of the manual effort goes into putting the data into Excel spreadsheets, which only puts a Band-Aid on the problem.

“The framework essentially provides pre-built routines to extract data from multiple data sources, as well as from financial systems,” said Walton. “It also provides, for lack of a better term, the data plumbing to enterprise standards, and most importantly there’s an analytic layer. The endgame is that the dashboards need to sit on top of an analytics layer that is easy to do analytics on. What it contains is pre-computed performance indicators based on approved business rules with multiple levels of aggregation.”
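As a toy illustration of that analytic layer, the following sketch (Python with pandas; every table and column name is invented for the example) applies one agreed business rule to a single indicator, average length of stay, then materializes it at several aggregation levels so dashboards can sit directly on top instead of recomputing from raw EHR extracts:

```python
import pandas as pd

# Invented encounter records standing in for feeds from two merged EHRs.
encounters = pd.DataFrame({
    "facility":   ["A", "A", "B", "B", "B"],
    "service":    ["cardiology", "cardiology", "cardiology", "oncology", "oncology"],
    "admitted":   pd.to_datetime(["2020-01-01", "2020-01-03", "2020-01-02",
                                  "2020-01-05", "2020-01-06"]),
    "discharged": pd.to_datetime(["2020-01-04", "2020-01-08", "2020-01-03",
                                  "2020-01-12", "2020-01-09"]),
})

# One approved business rule for the KPI, applied before any aggregation.
encounters["los_days"] = (encounters["discharged"] - encounters["admitted"]).dt.days

# The pre-computed indicator at multiple levels of aggregation.
alos_by_facility = encounters.groupby("facility")["los_days"].mean()
alos_by_facility_service = encounters.groupby(["facility", "service"])["los_days"].mean()
alos_systemwide = encounters["los_days"].mean()

print(alos_by_facility_service)
```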

An effective framework, as with so many other things, begins with C-suite leadership. Having executive sponsorship, or at least an understanding of the issue at an executive level, can translate into a vision for how to integrate the data and provide the analytics that are needed to successfully manage the business.

PLANNING PROACTIVELY

Walton once observed a national organization that acquired another entity, and after two years the CEO still didn’t have any executive dashboards — which means a lack of visibility into the performance metrics. The CEO then issued what was effectively a mandate to the acquiring organization: Get this done within three months, or else.

Thus began a flurry of activity to get the dashboard situation straightened out, which is not where an organization wants to be. Proactive planning is essential, yet Walton doesn’t see a lot of it in healthcare.

“I’ve never seen an organization proactively plan for this,” he said. “That doesn’t mean it’s not happening, but in my experience I haven’t seen it.”

In the meantime, mergers and acquisitions keep happening. Even if merging organizations become aware of the problem and factor it into their decision-making, there’s another issue to consider.

“Another extremely significant problem is data quality and information consistency,” said Walton. “That problem really is not universally dealt with, in healthcare or for that matter other industries. It’s almost like they’ve learned to live with it. It’s almost like we need a call to arms or something. You’re almost certainly going to have the need for an analytics framework that will apply the data to your standards.”

The data in question could encompass missing or clinically inappropriate data. Quality, in this case, has to do with the cleanliness of the data. In terms of consistency, a good example would be something like average length of stay. There’s an opportunity to ensure that the right data ownership and stewardship is in place.

Importantly, it’s primarily a business solution. It’s possible that one of the merging entities has a data governance strategy, but all too often that strategy was launched by the IT department — which is not where an organization wants to be, said Walton, because it’s primarily a business problem rather than one that’s purely technical.

“Data governance is a very well-known concept, but people struggle with its actual implementation for a number of reasons,” he said. “One, there’s a technical aspect of it, which centers around how we identify data quality issues. What kinds of tools are they going to use to address data quality issues?

“Then there’s establishing ownership of the data, and who are the subject matter experts. And there’s a workflow aspect that most organizations fail to deal with.”

It all starts with the framework. Only then can merging organizations get an appropriate handle on their data and analytics landscape.