The Perils of Confusing Performance Measurement with Program Evaluation

A group of researchers recently published a paper critiquing the child outcomes performance indicator for Part C and Part B 619. They also presented some of their thoughts in a recent webinar sponsored by the Association of University Centers on Disabilities (AUCD). The researchers’ critique is based on several faulty assumptions and consequently unfairly discredits the system for measuring child outcomes and the use of the data. Let’s look at our concerns with their critique.

First, the authors have confused performance measurement with program evaluation.

Their primary argument is that the child outcomes measurement requirement produces misleading information because it is based on a flawed evaluation design. The researchers’ critique wrongly assumes that the child outcomes indicator is designed as an evaluation. The child outcomes measurement is not a program evaluation; it is one performance indicator embedded within a larger performance measurement system that is required by the Individuals with Disabilities Education Act (IDEA). States report on a number of performance indicators that address compliance with federal regulations and program results. As such, these indicators yield information that supports program improvement and ongoing monitoring of program performance. Performance measurement systems are common in both the public sector (for example, Maternal and Child Health) and the private sector (for example, the Pew framework for home visiting). The Office of Special Education Programs (OSEP) implemented the child outcomes indicator in response to the Government Performance and Results Act, which requires all federal agencies to report on the results their programs are achieving. OSEP also uses the child outcomes indicator data to monitor states on the results achieved, consistent with the strong emphasis in IDEA on improving results for children with disabilities.

The Government Accountability Office has produced a succinct summary that highlights some of the differences between performance measurement and program evaluation. Performance measurement refers to ongoing monitoring and reporting of program accomplishments. Performance measures may address program activities, services and products, or results. The OSEP child outcomes indicator is a performance measure that addresses results. Examples of other results performance measures are teen pregnancy rates, the percentage of babies born at low birth weight, 3rd grade reading scores, and high school graduation rates. In contrast, program evaluations are periodic or one-time studies, usually conducted by experts external to the program, that involve a more in-depth look at a program’s performance. Impact evaluations are a particular type of program evaluation that determine the effect of a program by comparing the outcomes of program participation to what would have happened had the program not been provided.

Performance Measurement Compared to Program Evaluation

Feature | Performance Measurement | Program Evaluation
Data collected on a regular basis (e.g., annually) | Yes | No
Usually conducted by experts to answer a specific question at a single point in time | No | Yes
Provides information about a program’s performance relative to targets or goals | Yes | Possibly
Provides ongoing information for program improvement | Yes | No
Can conclude unequivocally that the results observed were caused by the program | No | Yes, if a well-designed impact evaluation
Typically quite costly | No | Yes
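To make the table’s causal-inference row concrete, here is a minimal sketch (in Python, with made-up outcome data) of the difference: a performance measure simply reports results for the children a program serves, while an impact evaluation estimates the program’s effect by comparing participants with a comparison group standing in for what would have happened without the program.

```python
# Hypothetical outcome data (1 = met the outcome, 0 = did not);
# illustration only -- not real Part C or Part B 619 results.
participants = [1, 1, 0, 1, 1, 0, 1, 1]      # children served by the program
comparison_group = [1, 0, 0, 1, 0, 0, 1, 0]  # similar children not served

def rate(outcomes):
    return sum(outcomes) / len(outcomes)

# Performance measurement: report the program's own results,
# for example against an annual target.
print(f"Performance measure: {rate(participants):.0%} met the outcome")

# Impact evaluation: estimate the program's effect as the difference
# between participants and the counterfactual comparison group.
effect = rate(participants) - rate(comparison_group)
print(f"Estimated impact: {effect:.0%} (participants minus comparison group)")
```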

A major difference between measuring outcomes in a performance measurement system versus a program evaluation is that a well-designed impact evaluation can conclude unequivocally that the results observed were caused by the program. Performance measures cannot rule out alternative explanations for the results observed. Nevertheless, performance measurement data can be used for a variety of purposes, including accountability, monitoring performance, and program improvement. Data on performance measures such as the Part C and Part B Section 619 child outcomes indicator can be used to track performance against a target or to compare results from one year to the next within programs or states. They can be used to identify state or local programs that could benefit from additional support to achieve better results. Comparing outcomes across states or programs should be done with an awareness that they might serve different populations, which could contribute to different outcomes. The solution is not to conclude that results data are useless or misleading but rather to interpret the results alongside other critical pieces of information, such as the performance of children at entry to the program or the nature of the services received. Two of OSEP’s technical assistance centers, the Center for IDEA Early Childhood Data Systems (DaSy) and the Early Childhood Technical Assistance Center (ECTA), have developed a variety of resources to support states in analyzing child outcomes data, including looking at outcomes for subgroups to further understand what is contributing to the results observed. Just as with tracking 3rd grade reading scores or the percentage of infants born at low birth weight, there is tremendous value in knowing how young children with disabilities are doing across programs and year after year.
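As a concrete illustration of these uses, here is a minimal sketch (Python, with hypothetical records, variable names, and a hypothetical target) of the kind of descriptive analysis a state might run on a results indicator: tracking the indicator against an annual target, comparing years, and breaking results out by a subgroup such as age at entry.

```python
import pandas as pd

# Hypothetical child-level records; real state data systems, variable
# names, and targets will differ (illustration only).
records = pd.DataFrame({
    "program_year": [2016, 2016, 2016, 2017, 2017, 2017],
    "age_at_entry_months": [8, 20, 30, 9, 22, 31],
    "improved_growth_rate": [True, False, True, True, True, False],
})

TARGET = 0.65  # hypothetical annual target for the indicator

# Track performance against the target and from one year to the next.
for year, group in records.groupby("program_year"):
    observed = group["improved_growth_rate"].mean()
    status = "met" if observed >= TARGET else "not met"
    print(f"{year}: {observed:.0%} vs. target {TARGET:.0%} ({status})")

# Subgroup view: break results out by age at entry to help interpret
# the statewide number alongside who the program actually served.
records["entered_before_24_months"] = records["age_at_entry_months"] < 24
print(records.groupby(["program_year", "entered_before_24_months"])
             ["improved_growth_rate"].mean())
```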

Second, the authors incorrectly maintain that children who did not receive Part C services would show the same results on the child outcomes indicator as children who did.

The researchers’ claim that the results states are reporting to OSEP would be achieved even if no services had been provided rests on a flawed analysis of data from the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B), a longitudinal study of children born in 2001. For their analysis, the authors identify a group of 24-month-olds in the data set whom they label as “Part C eligible children who did not receive Part C services.” These children:

  • Received a low score on a shortened version of the Bayley Scales of Infant Development (27 items) administered at 9 months of age by a field data collector; and
  • Were reported by a parent when the child was 24 months old as not having received services to help with the child’s special needs.

Few would argue that the determination of eligibility for Part C could be replicated by a 27-item assessment administered by someone unfamiliar with infants and toddlers with disabilities. Furthermore, data from the National Early Intervention Longitudinal Study show that very few children are identified as eligible for Part C based on developmental delay at 9 months of age. The first problem with the analysis is assuming that all of these children would have been Part C eligible. The second problem is that it is impossible in this data set to reliably identify which children did and did not receive Part C services. Parents were asked a series of questions about services in general; they were not asked about Part C services. As we and others who have worked with national data collections have learned, parents are not good reporters of program participation, for a variety of reasons. The only way to confirm participation in Part C services is to verify program participation, which the study did not do. Given that children who received Part C services cannot be identified in the ECLS-B data, no one should be drawing conclusions about Part C participation from this data set.

The authors also argue that a measurement phenomenon called “regression to the mean” explains why Part C and Part B 619 children showed improved performance after program participation. In essence, this argument says that the improvements seen in children’s functioning are not real changes but are actually due to measurement error. One can acknowledge the reality of errors in assessment results, but to maintain that measurement error is the sole or even a major explanation for the progress shown by children in Part C and Part B 619 programs is absurd.
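For readers unfamiliar with the term, the following is a minimal simulation (Python, with entirely made-up score scales and cut points) of what regression to the mean looks like when children are selected because of low scores on a noisy assessment. It illustrates the statistical artifact the authors invoke, which the paragraph above acknowledges is real but argues cannot account for the progress states report.

```python
import random

random.seed(0)
N = 10_000

# Hypothetical model: each child has a stable "true" skill level, but
# any single assessment adds random measurement error (made-up scales).
true_skill = [random.gauss(100, 15) for _ in range(N)]

def observe(skill):
    return skill + random.gauss(0, 10)  # one noisy assessment

entry_scores = [observe(s) for s in true_skill]

# Select children who scored low at entry (partly due to bad luck on a
# noisy test), then re-assess them with no intervention at all.
selected = [i for i in range(N) if entry_scores[i] < 80]
exit_scores = [observe(true_skill[i]) for i in selected]

mean_entry = sum(entry_scores[i] for i in selected) / len(selected)
mean_exit = sum(exit_scores) / len(exit_scores)
print(f"entry mean: {mean_entry:.1f}, exit mean: {mean_exit:.1f}")
# The exit mean drifts back toward the population average even though
# nothing changed -- that drift is regression to the mean. The argument
# above is that this artifact, while real, cannot by itself account for
# the progress children show in these programs.
```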

Moving Forward

State Part C and 619 programs are required by IDEA to report on multiple performance indicators, including child outcomes, as part of a larger performance measurement system. The child outcomes indicator was developed with extensive stakeholder input in order to maximize its utility to local programs, state agencies, and the federal government. The process of building the infrastructure needed to collect and use child outcomes data has been complex, which is why states have been working on it for over ten years. State agencies continue to identify and implement strategies for improving the collection and use of the data. We know that the data collection processes are not perfect and that more work needs to be undertaken to address data quality and other concerns. Building a national system for measuring outcomes for young children with disabilities receiving IDEA services is a long-term undertaking that requires ongoing effort to make the process better. Disparaging the performance indicator and the data reported by states based on incorrect assumptions and flawed analyses is not productive. Instead, the field needs to engage collectively in ongoing dialogue around critical issues of data quality, data analysis, and appropriate use of the data, based on an informed understanding of what the child outcomes indicator is and is not. Part C and Part B 619 state agencies and OSEP are at the forefront of collecting and using early childhood outcomes data to improve programs – which is exactly what performance measurement is intended to do.

Source: DaSy: The Center for IDEA Early Childhood Data Systems

Available at: http://dasycenter.org/the-perils-of-confusing-performance-measurement-with-program-evaluation/ 

Office of Head Start Upcoming Events

Explore and register for upcoming T/TA events, sorted by topic. Scroll down for General Interest; Education & Child Development; Family & Community Engagement; Financial & Program Management; Health & Social and Emotional Well-being; Partnerships in Education & Child Care; and Non-ACF Events in the Early Childhood Field.

To see events sorted by date, visit the Early Childhood Learning and Knowledge Center (ECLKC).

 

General Interest

Monday, March 12
4–4:45 p.m. ET
Online

MyPeers Orientation

Join this webinar for a 45-minute introduction to MyPeers, a community of practice forum for Head Start programs, staff, and partners. MyPeers is a virtual space for brainstorming, exchanging ideas, and sharing resources. Local program staff across the country can connect with and lend support to fellow early childhood colleagues.

Webinar Repeats (all ET): March 19 at 1 p.m.; April 12 at 2 p.m.; April 23 at 3 p.m.; May 8 at noon; May 16 at 2 p.m.

Education & Child Development

Wednesday, March 7
3–4 p.m. ET
Online

Spotlights on Innovative Practices: Relationship-Based Competencies for Professionals Who Work with Young Children

This is a live repeat of the December webinar, which introduced the updated resource Relationship-Based Competencies for Professionals Who Work with Young Children in Group Settings.


Tuesday, March 13
3–4 p.m. ET
Online

BabyTalks Series: Supporting Children’s Early Brain Development

For very young children, almost every experience is an opportunity for learning. Explore how children’s brains develop in the first few years of life.


Friday, March 16
3–4 p.m. ET
Online

Preschool Cognition: Supporting Early Math

Join this Teacher Time webisode to hear from experts about early math development. Learn how to integrate early geometry concepts and skills, like shapes and puzzles, into everyday teaching practices.


Tuesday, March 20
3–4 p.m. ET
Online

New and Revised: Making It Work – Implementing Cultural Learning Experiences in AIAN Early Learning Settings

Discover the importance of infusing language and culture in early learning programs. Hear about the newly updated Making It Work, a guide for implementing cultural learning experiences in American Indian and Alaska Native (AIAN) programs.


Family & Community Engagement

Thursday, March 29
3–4:15 p.m. ET
Online

Helping Families Prepare for Income Changes Throughout the Year

Nearly two-thirds of low-income families go through significant changes in household income during the year. Head Start and Early Head Start programs can play a key role in helping families develop a plan to handle sudden income changes. This webinar is part of the Building Foundations for Economic Mobility (BFEM) webinar series.


Financial & Program Management

Thursday, March 8
3–4 p.m. ET
Online

Program Planning and Data & Evaluation

This session will give an overview of the Program Planning and Data and Evaluation sections of the Head Start Management Systems Wheel. Topics will include coordinated approaches and how data supports continuous improvement.


Wednesday, March 28
3–4:30 p.m. ET
Online

Successful, Supportive Relationships with State Early Childhood Systems

Explore both grantee and state perspectives on building relationships that support access to the Child Care and Development Fund subsidy. Hear from state representatives and two Early Head Start-Child Care Partnership grantees, one rural and one urban, about the benefits of these relationships and what steps they took in building them. This webinar is part of the “Making Strides” series.


Thursday, April 12
3–4 p.m. ET
Online

Facilities and Learning Environments

This session continues the exploration of the Head Start Management Systems Wheel. Review key considerations in facilities management. This includes an overview of the facility development and renovation cycle, as well as the health and wellness implications in facility management.


Thursday, May 10
3–4 p.m. ET
Online

Transportation and Technology

This Head Start Management Systems Wheel session will address the fundamental concepts that support the systems of Transportation and Technology and Information Systems. This will include transportation planning, ensuring child safety, and the role of internal staff and external consultants in supporting your computers and software.


Health & Social and Emotional Well-being

Monday, March 5
2–3 p.m. ET
Online

Tummy Time: A Simple Concept with Enormous Benefits

Tummy time gives babies a chance to stretch and strengthen their muscles, which helps them push up, roll over, crawl, and walk. Join this webinar to explore a new suite of materials for home visitors and other professionals working with families with infants. Learn to encourage and incorporate tummy time into families’ routines. Help caregivers use tummy time as a special chance to bond and interact with babies.


Tuesday, March 6
1–2 p.m. ET
Online

Implementing Evidence-Based Hearing Screening Practices for Children 3 to 5 Years of Age in Head Start Programs

Learn about evidence-based hearing screening for children 3–5 years of age. Explore newly released instructional resources designed to assist those using Pure Tone screening.


Thursday, March 15
2–3 p.m. ET
Online

Nutrition Education in the Classroom

Nutrition is key for children’s healthy development, but it can be challenging to make it a part of your daily routine. Explore tips and strategies to create healthier eating environments for children in the classroom and at home.


April 10–12
All Day
Dallas, TX

I Am Moving, I Am Learning Team Trainings

I Am Moving, I Am Learning (IMIL) is a Head Start program enhancement created to address childhood obesity. It was not designed as a curriculum or an add-on. Join the team training to find out how IMIL fits seamlessly into what programs are already doing to meet the Head Start Early Learning Outcomes Framework. Apply online by March 9, 2018.


Partnerships in Education & Child Care

Tuesday, March 6
2–3:30 p.m. ET
Online

Strategies for Building and Financing the Supply of High Quality Early Learning Webinar Series: State and Local Finance Strategies

The National Center on Early Childhood Quality Assurance, in collaboration with the BUILD initiative, will facilitate a discussion about state and local revenue-generation strategies that fund quality services for children.


Tuesday, May 1
2–3:30 p.m. ET
Online

Strategies for Building and Financing the Supply of High Quality Early Learning: Utilizing Grants and Contracts, Payment Rates, and Financial Incentives to Increase Supply and Improve Quality

Hear from states that have used different strategies related to provider payments, grants and contracts, and financial incentives.

May 30 – June 1
All Day
Washington, DC

Research and Evaluation Conference on Self-Sufficiency (RECS)

Explore the latest findings from evaluations of programs, policies, and services that support low-income and vulnerable families on the path to economic self-sufficiency. RECS is presented by the Office of Planning, Research, and Evaluation (OPRE), Administration for Children and Families (ACF), U.S. Department of Health and Human Services (HHS).

Non-ACF Events in the Early Childhood Field

April 4–6
All Day
Online


April 23–27
All Day
Anaheim, CA

Visual Storytelling for Social Change

We understand our world through stories: the heroes we aspire to be, the conflicts we identify with, the ideas that move us. Visual storytelling—whether a photo series, an online video, a long-form documentary or virtual reality—can capture our attention, generate deep empathy, and move us to take action.

The resources below are designed to help both seasoned and budding social change activists imagine and design stories that boost attention to issues, engage audiences more deeply, and increase the influence of campaigns.

Source: The Culture Lab

Available at: http://theculturelab.org/visual-storytelling-for-social-change/

The Integration of Early Childhood Data: State Profiles and a Report from the U.S. Department of Health and Human Services and the U.S. Department of Education 

12/8/2016

The U.S. Departments of Health and Human Services (HHS) and Education (ED) announced the release of a report that will help states refine their capacity to use existing administrative data from early childhood programs to improve services for young children and families. The report covers key considerations when states integrate data and highlights progress in eight states that are actively developing and using early childhood integrated data systems (ECIDS). The report discusses technical assistance and other resources available to states as they develop their ECIDS.

Source: U.S. Department of Health and Human Services and the U.S. Department of Education, Early Childhood Development, Administration for Children and Families

Available at: http://www.acf.hhs.gov/ecd/early-childhood-data

Using Data to Measure Performance of Home Visiting

10/12/2015

Across the country, state legislatures are turning to evidence-based policymaking to ensure that taxpayer dollars are spent efficiently and effectively. One example is family support and coaching. In response to research confirming that the early years of childhood affect learning, behavior, and health for a lifetime, many states have invested in these programs, commonly referred to as “home visiting.” Evidence shows that families that participate in home visiting programs, which focus on strengthening vulnerable families with children under age 5, are often more self-sufficient and better able to handle the challenge of parenting and to raise healthier, safer children.

However, for many reasons, including differences in family needs, culture, and the availability of supportive community services, past evidence of effectiveness alone does not necessarily lead to positive outcomes. Evidence must play an essential role throughout the life of the program, from legislation and planning to design and implementation. Ongoing performance monitoring is vital to understanding whether desired family and child outcomes are being realized. Several states have passed legislation to make home visiting programs more effective and accountable by requiring the agencies that oversee them to set goals and measure results.

Source: The PEW Charitable Trusts

Available at: http://www.pewtrusts.org/en/research-and-analysis/reports/2015/10/using-data-to-measure-performance-of-home-visiting

Development of a Measure of Family and Provider/Teacher Relationship Quality (FPTRQ), 2010-2015 

8/2015

The goal of the FPTRQ project is to develop new measures to assess the quality of the relationship between families and providers/teachers of early care and education for children from birth to 5 years of age. The measures will examine this relationship from both the parent and the provider/teacher perspectives and capture important elements of family-provider/teacher relationships, such as attitudes of respect, commitment, and openness to change, and practices such as bi-directional communication, sensitivity, and flexibility. The project aims to develop measures that are appropriate for use across different types of early care and education settings, including Head Start and Early Head Start programs, center-based child care, pre-k classrooms, and home-based child care. In addition, a high priority of the project is to make the new measures culturally appropriate for diverse populations, including lower-income and higher-income families, ethnically/racially diverse providers and families, and non-English-speaking families and providers.

Tasks for the FPTRQ project include (1) reviewing literature on family and provider/teacher relationships; (2) developing a conceptual model of the key components of family-provider/teacher relationships that promote family engagement and lead to better family, child, and provider outcomes; (3) reviewing existing measures; (4) consulting with experts in relevant fields on possible content and format of the measures; (5) holding focus groups with parents and providers/teachers, developing items, and piloting the measure; (6) developing final measures for extensive data collection in a variety of care settings; (7) conducting psychometric and cognitive testing to ensure the soundness of the measures; (8) developing a sustainability plan for training on the measures and for producing future editions of the measures as needed; and (9) developing, cognitively testing, and pilot testing measures to assess the relationship quality between Family Service Staff and parents in Head Start/Early Head Start.

Source:  Office of Planning, Research & Evaluation, Administration for Children and Families

Available at: http://www.acf.hhs.gov/programs/opre/research/project/development-of-a-measure-of-family-and-provider-teacher-relationship-quality-fptrq

Child Outcomes Summary (COS) Process Module: Collecting & Using Data to Improve Programs

8/2015

This online learning module provides key information about the COS process and the practices that contribute to consistent and meaningful COS decision-making. Over the course of multiple sessions, participants will learn about:

  • why child outcomes data are collected;
  • the key features of the COS process;
  • the essential knowledge needed to complete the COS process;
  • how the three child outcomes are measured through the process;
  • how to identify accurate COS ratings using a team-based process;
  • the importance of comparing children’s current functional performance to age-expected functioning;
  • when and how to measure progress in the three child outcome areas; and
  • how to document ratings and evidence to support those ratings in COS documentation.

Please use the link below to register for the module. You will be automatically redirected to the module after registering. The module is self-paced, so you may access it as often as desired.

Source: ECTA Center and DaSy Center

Available at: http://dasycenter.org/child-outcomes-summary-cos-process-module-collecting-using-data-to-improve-programs/

Hazard Mapping Instructions for Grantees

8/2015

Hazard mapping is a process that Head Start programs can use after an injury occurs. It helps to: 1) identify locations at high risk of injury; 2) pinpoint systems and services that need to be strengthened; 3) develop a corrective action plan; and 4) incorporate safety and injury prevention into ongoing monitoring activities. Hazard mapping is employed effectively in emergency preparedness planning related to natural disasters. It is also used to isolate locations of disease outbreaks and determine where prevention efforts are most needed.

Source: Early Childhood Learning and Knowledge Center, National Center on Health

Available at: http://eclkc.ohs.acf.hhs.gov/hslc/tta-system/health/safety-injury-prevention/safe-healthy-environments

Using Data for Family and Program Progress

7/2015

Learn how to use data more effectively to strengthen your work with children and families. Use these resources to support family and program progress. They are designed for Head Start, Early Head Start, and other early care and education program staff. These resources are aligned with the PFCE Framework and Head Start Program Performance Standards.

Source: Early Childhood Learning and Knowledge Center, National Center on Parent, Family, and Community Engagement

Available at: http://eclkc.ohs.acf.hhs.gov/hslc/tta-system/family/assessing

Welcome to OHS Health Talks

7/2015

A Health Talk is a pre-recorded video or podcast that allows health managers to explore specific health topics in greater depth. The Health Talks include two series:

  • Health Chats: Listen as new tools and strategies are discussed to improve health outcomes for children.
  • Ask the Experts: Get answers to frequently asked questions from pediatricians, dentists, psychologists, and other health professionals.

Health Talks offer an easy way to learn more about some of the health issues that concern the early childhood community. Health professionals, technical assistance providers, and other early childhood health and safety staff share information on a variety of topics. The topics are chosen based on questions and suggestions submitted from the field. Send your suggestions for the next Ask the Expert or Health Chat presentations to nchinfo@aap.org.

What is a Health Chat

  • Digging Deeper into Safety and Injury Prevention Data
  • Using Stepping Stones and Compliance with Care to Support Infants and Toddlers
  • Identifying and Reporting Child Abuse and Neglect

Ask the Expert

  • What is Ask the Expert
  • Head Lice
  • Head Start and the Medical Home
  • Nurturing Health and Wellness in Early Childhood: Nurturing the Brain, the Environment, and the Nurturer

Source: Early Childhood Learning and Knowledge Center, National Center on Health

Available at: http://eclkc.ohs.acf.hhs.gov/hslc/tta-system/health/health-services-management/program-planning/health-talks.html