2020


Invest in What Works
State Standard of Excellence

With the outbreak of the COVID-19 pandemic, governors and state leaders have increasingly turned to data to guide their decision-making. All state governments publicly shared data on their COVID-19 emergency response dashboards, but states with a comprehensive data infrastructure (Criteria 3: Data Leadership; Criteria 4: Data Policies/Agreements; and Criteria 5: Data Use) were better positioned to mount a data-driven response.

COVID-19 Data Dashboards

The 2019 State Standard of Excellence identified 12 states with significant public-facing performance management systems. In 2020, however, all 50 states launched COVID-19 dashboards to monitor public health data and emergency response services. This alone constitutes a significant step forward for data-driven policymaking in state government.

Moreover, as of July 2020, 18 governors (nearly all of them from states highlighted in the 2020 State Standard of Excellence) had publicly committed to taking a data-driven and/or evidence-based approach in their COVID-19 response. Through daily press briefings, governors reaffirmed their state’s commitment to a response grounded in facts, science, and data. In regional plans, governors agreed to coordinate re-opening efforts and frequently share data as hotspots emerged. These actions respond to the public’s demand for data transparency and its expectation that governors and state leaders rely on data when making decisions. A recent national poll commissioned by Results for America and conducted by NORC at the University of Chicago found that 92% of Americans believe policymakers should seek the best evidence and data available when making decisions.

Beyond these efforts, several states took a comprehensive approach by outfitting their data dashboards with highly detailed information on economic, health, and demographic indicators. For example, Colorado’s COVID-19 dashboard features extensive data on hospitals, outbreaks, surveillance efforts, and the incidence and epidemic curve (see Colorado example in Criteria 5). Minnesota’s COVID Response Capacity Tracker monitors hospital surge capacity, child care capacity, and COVID-19 response funding in addition to economic and food security metrics, critical care supplies, dial back indicators, and demographic details (see Minnesota example in Criteria 2). In both states, these data informed governors’ actions and were used to drive everyday decisions during the pandemic.

Minnesota’s race/ethnicity data dashboard publishes demographic COVID-19 data by age, gender, and race/ethnicity; this approach was mirrored in 45 other states that also released demographic data about COVID-19 cases. These disaggregated data allow states to identify disproportionately impacted communities, in particular communities of color. For example, Virginia’s COVID-19 Health Equity Work Group (see Virginia example in Criteria 13) leveraged geospatial data to provide testing and personal protective equipment in communities typically underserved by government programs. These data enable states to tailor their response activities in equity-informed ways. States such as Massachusetts, Washington, and Connecticut are also encouraging community engagement and civic innovation by publicly posting open data on their state websites daily.

States Leverage Data to Support Pandemic Response Efforts

All 50 states publicly posted coronavirus data, but states with the ability to link, share, and integrate data were well-positioned to incorporate data insights into their response efforts. For example, InnovateOhio’s statewide data-sharing and integration platform (see Ohio example in Criteria 4) gave Ohio a head start by providing rapid, actionable COVID-19 data to decision-makers. Using this existing infrastructure, the state turned quarterly data deadlines into daily data reports, obtaining vital information, such as hospital capacity, more quickly.

Connecticut’s statewide data infrastructure helped the state leverage existing data-sharing agreements to match student and SNAP data, enabling students to receive SNAP Pandemic EBT food benefits directly, with no application necessary (see Connecticut example in Criteria 5). Similarly, in administering the federal COVID-19 Pandemic Unemployment Assistance program, Rhode Island’s Department of Labor and Training launched an improved cloud-based system to manage the surge in unemployment claims. This system enabled Rhode Island to be among the first states in the country to allocate Pandemic Unemployment Assistance benefits in the face of record-high unemployment claims, making it a model for other states (see Rhode Island example in Criteria 4). Prior to the pandemic, Virginia’s Chief Data Officer spearheaded the creation of a roadmap for enhanced data integration and use, which allowed an existing cross-agency data platform, initially launched to coordinate the state’s response to the opioid crisis, to be enhanced for the COVID-19 response (see Virginia examples in Criteria 3, 4, and 5). These efforts demonstrate how states were able to use existing data infrastructure to rapidly provide or expand services and benefits during the COVID-19 crisis.

States have also positioned themselves to leverage evidence and data infrastructure as part of the budget process. This approach to using evidence in fiscal decisions can foster innovation, build evidence of program effectiveness, and fund evidence-based programs that improve services for residents. With many states facing unprecedented budgetary decisions as a result of the COVID-19-induced recession, some states featured in the State Standard of Excellence have developed the capacity to invest in evidence-based programs to deliver better outcomes for residents.

For example, Colorado’s FY 2020-2021 budget development instructions (pp. 10-12) required agencies to describe the evidence and body of research supporting any new program (see Colorado example in Criteria 9). Minnesota also issued guidance on identifying evidence in budget proposals, which led to $87 million invested in new or expanded evidence-based programs in the FY 2020-2021 budget (see Minnesota example in Criteria 9). In 2019, the New Mexico legislature defined four tiers of evidence and required select agencies to identify their investments in evidence-based programs. As a result, the state legislature recommended more than $13.6 million in additional spending for evidence-based human services programs (see New Mexico example in Criteria 9).

North Carolina, Tennessee, and Rhode Island also demonstrated progress in leveraging evidence in their budget processes. North Carolina’s budget process encouraged departments to identify and develop evidence-focused budget proposals (see North Carolina example in Criteria 9). Tennessee’s Office of Evidence and Impact, founded in 2019, spearheaded an increased use of evidence and research in the state budget process (see Tennessee examples in Criteria 6 and Criteria 9). In Rhode Island, the FY 2021 budget instructions required agencies to describe program effectiveness and improvement of outcomes, rather than simply reporting activities and outputs (see Rhode Island example in Criteria 9). Collectively, these examples signaled a growing commitment across the country to leverage evidence in budget decisions to achieve improved results.

Criteria and Promising Examples
1. Strategic Goals: Did the Governor have public statewide strategic goals?
Relevant Blueprint Theme:

See the Managing for Results theme in the Blueprint to learn how states can make progress in meeting this criterion.

Launched in 2019, Colorado’s Governor’s Dashboard outlines four high-priority strategic goals: tax reform and economic development, energy and renewables, health, and education and the workforce.
2. Performance Management / Continuous Improvement: Did the state or any of its agencies implement a performance management system aligned with its statewide strategic goals, with clear and prioritized outcome-focused goals, program objectives, and measures; and did it consistently collect, analyze, and use data and evidence to improve outcomes, return on investment, and other indicators of performance?
Relevant Blueprint Theme:

See the Managing for Results theme in the Blueprint to learn how states can make progress in meeting this criterion.

In response to COVID-19, Minnesota built a comprehensive public data dashboard that tracks health and economic data, including response data on hospital capacity, critical care supplies, child care, and funding.
3. Data Leadership: Did the governor’s office or any state agency have a senior staff member(s) with the authority, staff, and budget to collect, analyze, share, and use high-quality administrative and survey data—consistent with strong privacy protections—to improve (or help other entities including, but not limited to, local governments and nonprofit organizations improve) federal, state, and local programs? (Example: Chief Data Officer)
Relevant Blueprint Theme:

See the Leveraging Data theme in the Blueprint to learn how states can make progress in meeting this criterion.

A 2018 Connecticut law formalized the position of Chief Data Officer within the Office of Policy and Management and created the Connecticut Data Analysis Technology Advisory Board.
4. Data Policies / Agreements: Did the state or any of its agencies have data-sharing policies and data-sharing agreements—consistent with strong privacy protections—with any nonprofit organizations, academic institutions, local government agencies, and/or federal government agencies that were designed to improve outcomes for publicly funded programs, and did it make those policies and agreements publicly available? (Example: data-sharing policy, open data policy)
Relevant Blueprint Theme:

See the Leveraging Data theme in the Blueprint to learn how states can make progress in meeting this criterion.

In April 2019, Ohio’s Governor signed an executive order consolidating state data systems into the InnovateOhio Platform, which uses data as “a shared strategic asset” whose “value is multiplied when data sets are linked across programs and organizations” through data integration and management tools.
5. Data Use: Did the state or any of its agencies have data systems consistent with strong privacy protections that linked multiple administrative data sets across state agencies, and did it use those systems to improve federal, state, or local programs?
Relevant Blueprint Theme:

See the Leveraging Data theme in the Blueprint to learn how states can make progress in meeting this criterion.

The Indiana Management Performance Hub (MPH), overseen by the state’s Chief Data Officer, houses the integrated Education and Workforce Development database, which brings together data from 12 state agencies, including the Commission for Higher Education, Department of Education, Department of Health, Department of Corrections, Department of Workforce Development, and Family and Social Services Administration.
6. Evaluation Leadership: Did the governor’s office or any state agency have a senior staff member(s) with the authority, staff, and budget to evaluate its major programs and inform policy decisions affecting them? (Example: Chief Evaluation Officer)
Relevant Blueprint Theme:

See the Building and Using Evidence theme in the Blueprint to learn how states can make progress in meeting this criterion.

Founded in 2019, Tennessee’s Office of Evidence and Impact is led by the state’s Director of Evidence and Impact. To advance Tennessee’s evidence-based budgeting efforts, the Office defined four tiers of evidence, conducted program inventories, developed evidence reviews, and provided evidence-building technical assistance.
7. Evaluation Policies: Did the state or any of its agencies have an evaluation policy, evaluation plan, and research/learning agenda(s), and did it publicly release the findings of all completed evaluations?
Relevant Blueprint Theme:

See the Building and Using Evidence theme in the Blueprint to learn how states can make progress in meeting this criterion.

The Kentucky Center for Statistics (KYSTATS) has a 2020-22 Research Agenda, which details four primary research areas to identify barriers to education and workforce opportunities: (1) expand data access and data use to inform equity issues; (2) evaluate outcomes and barriers for education and workforce programs over time; (3) connect supply and demand of the state’s future workforce; and (4) measure the impact of out-of-state education and workforce migration.
8. Evaluation Resources: Did the state or any of its agencies invest at least 1% of program funds in evaluations?
Relevant Blueprint Theme:

See the Building and Using Evidence theme in the Blueprint to learn how states can make progress in meeting this criterion.

A 2017 law created Minnesota’s Opiate Epidemic Response grant program. The FY 2021 budget includes $300,000 for Minnesota Management and Budget to conduct experimental and quasi-experimental design impact evaluations for opiate epidemic response grant activities, which is slightly more than 1.1% of the agency’s $27 million general fund budget.
9. Outcome Data: Did the state or any of its agencies report or require outcome data for its state-funded programs during its budget process?
Relevant Blueprint Theme:

See the Building and Using Evidence theme in the Blueprint to learn how states can make progress in meeting this criterion.

The 2013 Colorado State Measurement for Accountable, Responsive and Transparent Government (SMART) Act required all Colorado state agencies to submit annual performance reports to the state legislature as part of the state’s budget process.
10. Evidence Definition and Program Inventory: Did the state or any of its agencies release a common evidence framework, guidelines, or standards to inform its research and funding decisions and make publicly available an inventory of state-funded programs categorized based on at least two tiers of evidence?
Relevant Blueprint Theme:

See the Building and Using Evidence theme in the Blueprint to learn how states can make progress in meeting this criterion.

A 1999 New Mexico law required all state agencies to submit annual performance-based budget requests that include outputs, outcomes, performance, and evaluation data.
11. Cost-Benefit Analysis: Did the state or any of its agencies assess and make publicly available the costs and benefits of public programs?
Relevant Blueprint Theme:

See the Investing for Results theme in the Blueprint to learn how states can make progress in meeting this criterion.

A 2013 Washington State law directed the Department of Corrections, in consultation with the Washington State Institute for Public Policy (WSIPP), to: (1) compile an inventory of existing programs; (2) determine whether its programs were evidence-based; (3) assess the effectiveness of its programs, including conducting a cost-benefit analysis; and (4) phase out ineffective programs and implement evidence-based programs.
12. Use of Evidence in Grant Programs: Did the state or any of its agencies (1) invest at least 50% of program funds in evidence-based solutions or (2) use evidence of effectiveness when allocating funds to eligible grantees (including local governments) from its five largest competitive and noncompetitive grant programs?
Relevant Blueprint Theme:

See the Investing for Results theme in the Blueprint to learn how states can make progress in meeting this criterion.

Since 2017, the Nevada Department of Education has allocated 100% of the state’s $9.5 million in federal Title I school improvement funds to districts and schools for interventions backed by strong, moderate, or promising evidence (using the top three tiers of evidence as defined by the federal Every Student Succeeds Act (ESSA)).
13. Innovation: Did the state or any of its agencies have staff, policies, and processes in place that encouraged innovation to improve outcomes?
Relevant Blueprint Theme:

See the Investing for Results theme in the Blueprint to learn how states can make progress in meeting this criterion.

In 2020, California launched the California COVID Assessment Tool to identify potential COVID-19 hotspots, predict which hospitals might reach capacity, and proactively allocate resources to such hotspots.
14. Contracting for Outcomes: Did the state or any of its agencies enter into performance-based contracts and/or use active contract management (frequent use of data and regular communication with providers to monitor implementation and progress) to improve outcomes for publicly funded programs?
Relevant Blueprint Theme:

See the Investing for Results theme in the Blueprint to learn how states can make progress in meeting this criterion.

Since 2015, Rhode Island’s Department of Children, Youth, and Families has worked to reform and restructure its procurement process.
15. Repurpose for Results: Did the state or any of its agencies shift funds away from any practice, policy, or program that consistently failed to achieve desired outcomes?
Relevant Blueprint Theme:

See the Investing for Results theme in the Blueprint to learn how states can make progress in meeting this criterion.

Since 2013, the Pennsylvania Department of Corrections has set performance targets for its community corrections program through performance-based contracts.

Results for America’s analysis is based on data provided under license by The Pew Charitable Trusts’ Results First initiative (which was used to inform its 2017 report on states’ engagement in evidence-based policymaking) and input from more than 150 current and former state government officials and other experts.

Results for America thanks all the leaders who contributed to the 2020 State Standard of Excellence. Please see the full list of Acknowledgements here.