Understanding ERB’s Updated CTP Norms

By Glenn Milewski and Adrienne Hu

Introduction

At ERB, our goal is to help schools see the potential in every student. We believe a great way to do this is by offering assessments at various points in a student’s academic career and by enabling educators to explore and extract insights from assessment data using 360 Access.

One way our members understand student learning is by comparing their assessment results to ERB norms. The purpose of norms is to describe the test performance of a reference group of test-takers. Norms are useful only insofar as the data sets used to compute them represent the population of interest and are large enough to yield precise statistics. ERB computes norms with large data sets that are updated every year. We provide norms for more than 50 school groupings, including independent schools, private and faith-based associations, a nationally representative group of public and private schools, and others.

When the COVID-19 pandemic and the resulting global shutdown led to significantly reduced testing volumes, we “froze” our norms—meaning, we postponed our normal practice of updating norms with the most recently available data. We are excited to report that we “unfroze” our norms starting with the CTP administration that began on March 7, 2022.

What Happened to ERB Testing During the Pandemic?

Each spring, schools typically administer over one million CTP subtests to more than 250,000 students around the world. However, in the spring of 2020, the number of CTP subtests administered dropped to a remarkable low of 50,000 as schools shifted focus to prioritize safety, remote instruction, and student well-being.

When schools reopened, testing steadily resumed as schools were eager to understand the impact of the pandemic on student learning. This spring, ERB projects that testing volume will reflect at least 80% of pre-pandemic levels.

Figure 1: Impact of the pandemic and global shutdown on ERB testing volume.

How We Updated Norms and What to Expect

Our standard practice at ERB is to report three-year “rolling norms.” This means we typically update norms each season using data from the past three seasons of testing. For the spring of 2022, however, updated CTP norms will be based on four years of spring data, from 2018 through 2021.

Although the updated norms are based on four years of CTP data, including two years of pre-pandemic and two years of pandemic data, the total number of cases from the two pandemic years only slightly exceeds the number of cases from any single pre-pandemic year. Therefore, the norms remain more heavily weighted toward pre-pandemic performance. It is also important to point out that although the independent school norms and various association norms will be updated, the suburban and national norms will not. There simply was not enough new data to update the suburban norms, and the national norms are based on a separate data collection and research effort.
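The weighting described above can be made concrete with a short sketch. The year-by-year volumes and mean scores below are hypothetical, chosen only to mirror the shapes described in this article (they are not published CTP figures). Pooling weights each year by its case count, so the low-volume pandemic years move the pooled mean only slightly away from pre-pandemic performance.

```python
# Sketch: rolling norms pool several years of data, weighting each
# year by its case count. All volumes and means below are hypothetical.
years = {
    2018: {"n": 250_000, "mean": 760.0},  # pre-pandemic
    2019: {"n": 250_000, "mean": 760.0},  # pre-pandemic
    2020: {"n": 50_000,  "mean": 753.0},  # pandemic year, very low volume
    2021: {"n": 210_000, "mean": 753.0},  # pandemic year
}

total_n = sum(y["n"] for y in years.values())
pooled_mean = sum(y["n"] * y["mean"] for y in years.values()) / total_n

# The two pandemic years together (260,000 cases) only slightly exceed a
# single pre-pandemic year (250,000), so the pooled mean stays close to
# the pre-pandemic 760 rather than dropping all the way to 753.
print(f"pooled mean: {pooled_mean:.2f}")
```

Because the weight on the pandemic years is roughly a third of the total, the pooled mean absorbs most of the pandemic-era drop, which is the mechanism behind the smaller differences reported below.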

What is the impact of these updates?

The performance of students who tested during the pandemic was, on average, lower than that of students who tested before it. For example, mean scores in the spring of 2021 were 7 points lower on Reading Comprehension and Mathematics than those in the spring of 2019. For the Verbal and Quantitative Reasoning tests, scores were 10 points lower. These findings echo those of Rochon and Shuman (2021), who found that among matched students evaluated over time, performance growth during the pandemic was smaller for reasoning tests than for traditional achievement tests. The differences above are considered small.¹ However, there were larger differences between 2021 and 2019 for certain tests, levels, and associations, especially when these groups had low CTP volume in 2021.

The lower performance of students testing during the pandemic is mostly absorbed when pre-pandemic and pandemic data are combined through ERB’s rolling norms methodology. For example, mean scores across the above tests were lower by only 4 points in the updated norm group (based on the last four years of data) than in the previously used “frozen” norm group.

Figure 2 illustrates the impact of lower performance on percentiles. The two curved lines show the relationship between test scores and percentiles for Level 8 of the Mathematics test. In the updated norms, a student with a score of 750 has a percentile of 70; in the frozen norms, a student with the same score would have had a percentile of 67. This difference is due to the slightly lower performance of the updated norm group. As Figure 2 shows, percentile differences are greater in the middle of the score scale than at its lower or upper tails.
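The score-to-percentile mapping behind Figure 2 can be sketched numerically. The means and standard deviation below are hypothetical, chosen only so that a score of 750 reproduces the 67th-versus-70th-percentile example under an assumed normal score distribution; actual CTP norms are empirical, not parametric.

```python
import math

def percentile(score, mean, sd):
    """Percentile rank of `score` under a normal reference distribution."""
    z = (score - mean) / sd
    return 100 * 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Hypothetical norm-group parameters (illustrative only, not CTP values).
frozen  = {"mean": 728.0, "sd": 50.0}  # slightly higher-performing group
updated = {"mean": 724.0, "sd": 50.0}  # slightly lower-performing group

# The same score ranks higher against the lower-performing updated group.
print(round(percentile(750, **frozen)))   # 67
print(round(percentile(750, **updated)))  # 70
```

The sketch also shows why percentile differences shrink at the tails: near the middle of the distribution the cumulative curve is steepest, so a small shift in the reference group moves percentiles the most there.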

It is important to point out that these differences make the updated norms more valid than the frozen ones. As characteristics of a reference group change, it’s important that norms reflect these changes. Now that there is sufficient data to update norms to reflect these changes, ERB can provide more meaningful information to aid in score interpretation.

Figure 2: A comparison of frozen versus updated norms for the CTP Mathematics Level 8 test.

Summary

The global shutdown resulting from the COVID-19 pandemic had a dramatic impact on ERB testing. In response, ERB “froze” CTP norms. As testing has resumed, ERB is updating norms to reflect a mix of pre-pandemic and pandemic data. The performance of the updated norm group is slightly lower than that of the frozen norm group, but the updated norms more precisely reflect current learning achievement in the new normal of life during the pandemic.



¹ In the social sciences, effect sizes are sometimes described in terms of changes relative to a standard deviation. The lower mean scores observed in 2021 represent about 10% of a standard deviation across the various CTP scales, which is considered a small effect.
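As a worked example of the arithmetic in this footnote: a standardized mean difference divides the score drop by the scale’s standard deviation. The standard deviation below is hypothetical, back-calculated from the ~10% figure rather than taken from published CTP documentation.

```python
# Effect size as a standardized mean difference.
# The scale SD is hypothetical, implied by the ~10%-of-an-SD figure.
mean_drop = 7.0   # points lower in spring 2021 vs. spring 2019
scale_sd = 70.0   # hypothetical standard deviation of a CTP scale

effect_size = mean_drop / scale_sd
print(f"effect size: {effect_size:.2f} SD")  # 0.10 SD, a small effect
```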

