
Pandemic Data Initiative FAQ

As associate vice provost for Public Sector Innovation at Johns Hopkins University, Beth Blauer oversees initiatives that help governments deploy data to maximize the delivery of public services. Blauer says she wants the Pandemic Data Initiative (PDI) to spur discussions about building a modern data collection infrastructure that would empower local, state, and federal agencies to execute a coordinated response for the next major public health crisis.

Authors:
Joshua E. Porterfield, PhD

What is the Pandemic Data Initiative?

We started the Pandemic Data Initiative because we realized there was a lack of a clear voice explaining what COVID-19 data is depicting and how the data serves as a resource for lay people, academics, the media, and policy makers.

There is a lot of complexity in the way COVID-19 data has been collected and reported that we haven’t been able to fully contextualize because of a lack of uniformity in how state and federal agencies manage it. The PDI aims to explain how the data got to where it is and to explore opportunities for creating more high-value public data sets.

What has been the biggest challenge during the pandemic for effectively using data?

In the absence of a federal standard for collecting and reporting data, state and local governments were left to their own devices to determine how data would be shared and how it would be used to influence public policy.

State and local governments experienced a lot of confusion because there are so many different approaches for how to use the data. That confusion did not help support the most effective responses to this public health crisis.

What are the consequences of such a hodgepodge data management effort?

In the absence of comprehensive, standardized data, we still don’t know who has been underrepresented in testing data. We still don’t know where real barriers to testing access existed to determine who was being infected with COVID-19.

But we do know that testing wasn’t equally distributed across the country, and we know there are large portions of the population that didn’t have access to testing.

Because there was so much inconsistency in the way the data has been reported, we couldn’t use it to pinpoint the greatest needs in order to provide highly targeted advice to state and local governments about effectively deploying resources. And that has extended throughout the entire arc of the pandemic, from testing data to vaccination rollouts.

How have issues of inconsistent data impacted vaccination rollouts?

We still don’t know who is getting access to vaccines. And that leads to more complications for dealing with the last mile of vaccination outreach to hard-to-reach communities and to those who are hesitant to get vaccinated. As vaccine supply outpaces demand, it is critically important that we know who to reach out to, who is seeking out vaccines and who is not, and how best to target public education in communities with high levels of vaccine hesitancy.

We don’t even know who is still trying to find a vaccine and who is still having difficulty. There are people who may not have a computer or broadband access but who still very much want to get vaccinated; if governments do not know where they are, they can’t connect them with sign-up services that are entirely online.

If cities had access to data showing exactly who wanted vaccines but couldn’t access them, they would be able to deploy mobile units to try to reach those communities. Instead, they are stuck using anecdotal information to decide how to deploy those resources rather than being super targeted.

Even at federally funded FEMA mass vaccination sites, it’s unclear whether the data on who’s getting vaccinated is being provided to local governments. And local governments are the front lines for containing COVID-19.

All of that is an opportunity for data to help lead these decisions on how best to deploy limited resources of the government.

Are local, state, and federal agencies willing to develop standardized methods?

I work with mayors all over the nation. They are desperate for support in this area. They want information. Here in Baltimore, Mayor Brandon Scott consistently says he needs information to understand who in Baltimore is trying to get vaccinated and who has been vaccinated.

If the data were in better shape it could really influence the deployment of limited public resources. And he’s not alone. There are mayors all over the country who are really struggling because they know the data could help them identify pockets where the disease is likely to emerge. We know vaccines save lives and can stave off some of the most severe impacts of COVID.

‘If we have to wait for people to get sick to figure out where to target government outreach it’s too late.’

What are some impediments to achieving data uniformity to be prepared for the next major public health crisis?

There is a lot of health data exchanged between state and federal governments, but the collection of it is antiquated. You can get comorbidity data by state, but there is a two-year lag before it gets published.

It is a huge undertaking to bring that data in, to clean it, and to get it back out to make it public. There are better ways to do it. A lot of the processes for getting and sharing data throughout the government are old. Their frameworks need to be renewed.

What are some possible ways to achieve this goal?

We have to establish better incentives for creating more powerful data collection and sharing methods.

That could be some sort of block grant or other support, or there could also be punitive incentives. But there needs to be more motivation now that we have seen over the past 17 months how the lack of data can impede life-saving responses to public health crises. This is not a problem that is going to disappear. It’s only going to get worse as we continue to confront significant public health challenges.

The federal government needs to more aggressively promote its ability to help state and local governments. The federal government has some of the smartest data thinkers in places like the Census Bureau, the CDC, and the U.S. Digital Service. There are plenty of people throughout the federal government who know how to do this.

One thing I’ve learned over the last year is that government wants to be better at this. We’ve had governors and mayors calling the Centers for Civic Impact telling us they want to improve their data and they want help to get their data efforts into shape.

It’s not for a lack of wanting to do a better job. It’s really a lack of the capacity to understand how to do a better job. And a lot of that was driven by the fact that there were no standards, so cities and states were making gut calls on how to do it. Some of them got it right, some of them didn’t. Some just didn’t have the infrastructure in place, so they had to stand up the infrastructure and do the collection at the same time.

How can the Pandemic Data Initiative help?

We’ve done a really great job at the Coronavirus Resource Center pairing our views of the data with the subject matter experts in public health at Johns Hopkins.

But the university also has very rich expertise in data use and its application in a public context. The PDI is our opportunity to bring in that expertise and try to improve the landscape so that we’re better prepared for the next big public health crisis.

We have a team that has been working with local and federal governments for years, trying to tackle some of the biggest entrenched challenges for data use in government. The PDI is a place where we can provide a broader platform for that expertise.