Making Meaning out of Data

Real-time data is a powerful tool for systems change, but only if it’s used to inform decisions. See how to leverage insights from evaluation data to drive improvements in program delivery.

Making Decisions Based on Real-Time Data Comparisons
“Many of the reports and the data that we use are so old that ... it’s going to take two years for the next report to show all the effort that we’ve put in … And that’s really challenging if we’re trying to have impact sooner rather than later.”
SNAP administrator in Louisiana

Progress Over Time

Tracking progress over time gives administrators a sense of whether different measures of performance are improving and whether those trends are stable. Administrators can see whether a strategic change appears to successfully impact program delivery and whether there are important cyclical patterns that might suggest fine-tuning operations.
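One simple way to surface such a trend is a rolling average, which smooths month-to-month noise so administrators can see whether a measure is stably improving. The sketch below uses hypothetical monthly processing-time figures; the field names and values are illustrative, not real program data.

```python
from statistics import mean

# Hypothetical monthly averages of days-to-decision for applications.
# These numbers are placeholders for illustration only.
monthly_days_to_decision = {
    "2023-01": 24.1, "2023-02": 23.5, "2023-03": 25.0,
    "2023-04": 22.8, "2023-05": 21.9, "2023-06": 20.7,
}

def rolling_average(series, window=3):
    """Smooth month-to-month noise so the underlying trend is visible."""
    months = sorted(series)
    return {
        months[i]: round(mean(series[m] for m in months[i - window + 1 : i + 1]), 1)
        for i in range(window - 1, len(months))
    }

trend = rolling_average(monthly_days_to_decision)
# A declining rolling average suggests processing times are improving.
```

A three-month window is an arbitrary choice here; a longer window trades responsiveness for stability, which also helps distinguish real improvement from seasonal cycles.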

Demographic Breakdowns

Viewing key metrics broken out by demographic indicators like race, language, and household type allows administrators to evaluate outcomes with an equity lens and observe if there are notable disparities between subpopulations that should be explored and addressed.
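Computationally, this kind of breakdown is a group-by over application records. The sketch below assumes a hypothetical record schema (the `language` and `approved` fields are illustrative) and computes an approval rate per subgroup.

```python
from collections import defaultdict

# Hypothetical application records; field names and values are illustrative.
applications = [
    {"language": "English", "approved": True},
    {"language": "English", "approved": True},
    {"language": "Spanish", "approved": True},
    {"language": "Spanish", "approved": False},
    {"language": "English", "approved": False},
    {"language": "Spanish", "approved": False},
]

def approval_rate_by(records, field):
    """Break a key outcome out by a demographic indicator."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[field]] += 1
        approvals[r[field]] += r["approved"]
    return {group: round(approvals[group] / totals[group], 2) for group in totals}

rates = approval_rate_by(applications, "language")
# A large gap between groups flags a disparity worth investigating.
```

The same function works for any demographic field, so one utility can power breakdowns by race, language, or household type.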


Benchmark Comparisons

Without meaningful comparisons, it’s difficult to know whether performance is good or bad. Choosing a benchmark against which to compare your data provides important context. You can compare to a formal mandate (e.g., application processing timelines), industry standards (e.g., maximum call abandonment rates or call center wait times), or even the average or best performance of your own state or its peers (e.g., the national average error rate or your state’s average monthly application processing time).
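In code, a benchmark comparison is just a check of a metric against one or more chosen targets. The benchmark names and all the numbers below are hypothetical placeholders, not real figures for any program.

```python
# Hypothetical benchmarks and state figure; all numbers are placeholders.
benchmarks = {
    "formal mandate (days to process)": 30.0,
    "national average (days to process)": 22.0,
}

our_avg_processing_days = 25.5  # illustrative state figure

def compare_to_benchmarks(value, targets):
    """Report whether a metric meets or misses each chosen benchmark."""
    return {
        name: "meets" if value <= target else "misses"
        for name, target in targets.items()
    }

status = compare_to_benchmarks(our_avg_processing_days, benchmarks)
# Meeting the mandate while missing the peer average tells a richer
# story than either comparison alone.
```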

Business Process Breakdowns

Real-time data can reveal how accessible or challenging the application process is, depending on the business process channel an applicant chooses. For example, reviewing the number of applications submitted via paper, desktop computer, landline phone, and mobile device strongly indicates how accessible the service is for applicants who rely on online services, or whose only internet access is a mobile device. Comparing outcomes across these access points helps ensure that someone who applies for benefits on a mobile phone has the same chance of approval as someone who applies any other way.
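A first step in this analysis is simply computing each channel's share of submissions. The channel names below mirror those discussed above, but the submission records themselves are invented for illustration.

```python
from collections import Counter

# Hypothetical submission records; one entry per application.
submissions = ["mobile", "desktop", "paper", "mobile", "phone",
               "mobile", "desktop", "paper", "mobile", "desktop"]

def channel_share(channels):
    """Share of applications arriving through each access point."""
    counts = Counter(channels)
    total = len(channels)
    return {ch: round(n / total, 2) for ch, n in counts.items()}

shares = channel_share(submissions)
# A large mobile share signals how much a mobile-friendly flow matters.
```

Pairing these shares with the per-channel approval rates then shows whether any access point is quietly disadvantaging the people who use it.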

Comparisons Across Jurisdictions 

The ability to break out key metrics and outcomes by state, county, or even local office helps administrators quickly identify bright spots and challenges in specific jurisdictions or counties. These charts are easiest to understand when metrics are sorted in ascending or descending order. To facilitate the most meaningful comparisons, it can make sense to group the data by an additional variable, such as counties with very large or very small caseloads. This can even create opportunities for peer-to-peer learning.
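The grouping-then-sorting described above can be sketched directly. The county names, caseload figures, and timeliness rates below are hypothetical, as is the 50,000-caseload threshold separating "large" from "small" peer groups.

```python
# Hypothetical county metrics; names, figures, and threshold are illustrative.
counties = [
    {"name": "County A", "caseload": 120_000, "timeliness": 0.91},
    {"name": "County B", "caseload": 4_000,   "timeliness": 0.78},
    {"name": "County C", "caseload": 95_000,  "timeliness": 0.83},
    {"name": "County D", "caseload": 6_500,   "timeliness": 0.95},
]

def ranked_by_peer_group(rows, threshold=50_000):
    """Sort counties within large/small caseload groups for fair comparison."""
    groups = {"large": [], "small": []}
    for row in rows:
        key = "large" if row["caseload"] >= threshold else "small"
        groups[key].append(row)
    return {
        key: sorted(members, key=lambda r: r["timeliness"], reverse=True)
        for key, members in groups.items()
    }

ranking = ranked_by_peer_group(counties)
# The top performer in each group is a natural peer-learning partner
# for the others in that group.
```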


Another way to show comparisons between jurisdictions is a map. Maps show at a glance how regions are performing relative to one another. Major differences across jurisdictions may require further analysis to understand why these disparities exist.

Using Data to Drive Change
“We used to get a lot of comments from staff that we only cared about the numbers … What we tried to do is redirect staff to understand in this culture shift that, yes, we care about the numbers because every single one of those numbers is a person. It is a person with a document, it is a person on the phone, it is a person who is waiting. And they matter, so the numbers matter.”
SNAP director in Washington

Data-Driven Implementation

Data alone does not drive change. To be meaningful, data has to be used consistently, on an ongoing basis, to inform program decisions. In short, how the data is used is what drives change at the organizational level.

In a data-driven institution, organizational decisions are made based on actual data rather than observations, estimates, or intuition. Establishing a data-driven model for safety net delivery requires the right data infrastructure and organizational processes to be in place so that an agency can effectively collect and organize data, run and write reports, and use data in the context of management, decision making, and partnerships.

Being data-driven requires easy access to data for staff across the organization. Data should be stored in databases that can be easily queried, not in PDFs or text files, and should be continuously updated to reflect real-time insights. Tools like live data dashboards create visibility and access for staff beyond data and IT practitioners.

Being able to use data also requires that the data is reliable and high quality. The way data is collected and stored should aim to reduce error (missing fields, for example), increase granularity and specificity (individual-level data rather than aggregates), and reduce bias. The University of Chicago’s Center for Data Science and Public Policy developed a Data Maturity Framework to help organizations assess and improve their data practices and processes.
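One basic quality check is counting missing values per field so problems can be fixed upstream, at collection time. The record schema and field names below are hypothetical.

```python
# Hypothetical record schema; field names and values are illustrative.
REQUIRED_FIELDS = ["application_id", "submitted_at", "county", "channel"]

records = [
    {"application_id": 1, "submitted_at": "2023-06-01", "county": "A", "channel": "mobile"},
    {"application_id": 2, "submitted_at": None, "county": "B", "channel": "paper"},
    {"application_id": 3, "submitted_at": "2023-06-02", "county": None, "channel": None},
]

def missing_field_report(rows, required):
    """Count missing values per field -- a basic data quality signal."""
    return {
        field: sum(1 for r in rows if r.get(field) is None)
        for field in required
    }

report = missing_field_report(records, REQUIRED_FIELDS)
# Fields with high missingness point to gaps in how data is collected.
```

Running a report like this on a schedule, rather than once, is what turns a spot check into an ongoing quality practice.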

Creating a Data Culture

Moving to a data-driven culture requires more than investing in data infrastructure, maintenance, and usage processes. Shifting organizational culture around data is a critical step toward creating effective data-driven systems.

This requires shifting from a culture in which only a few individuals deeply understand the data and what can be done with it, to one where staff all across the organization feel a connection to data and understand why it is critical to their work. In many organizations, data is only written into a few job descriptions and the majority of staff come to think of data as someone else’s job (“I always just ask Dave to pull reports for me.”). Having a basic understanding of data—including what kinds of data exist, how it is stored and managed, and how to ask for needed data—should be a part of every individual’s job. The organization must empower staff to feel confident and comfortable in their ability to work with the data the organization collects and uses.

Ready to measure what matters?