November 10, 2021

Confused by your child’s state assessment results? You’re not alone.

By Michelle Croft

As trained psychometricians, my husband and I study how to design student achievement tests and interpret the scores. And if that work wasn’t complicated enough, our son took his first statewide standardized assessment last spring. We thought we were well prepared to review his results, but we were wrong. When we received an email in mid-October from our school district on how to access his results, my husband said to me, “Now I understand why people complain about standardized tests.” 

The process to get our son’s test scores was not at all user friendly, and I can’t imagine that we’re the only parents experiencing this level of confusion as families like ours receive spring 2021 student assessment results.  

First, we had to log into the school’s student information system (e.g., Infinite Campus, PowerSchool) where we could view his scores, proficiency levels (e.g., advanced, proficient, and not proficient), and the number of questions answered correctly for different portions of the test. Because our son had tested in person, there was also a claim code so we could create a separate “Parent Portal” account from the test vendor. If he had tested remotely, the only information that we would have received would have been his scores in the district system. We were instructed to take the scores, open a technical manual that had been linked in the email, and use the manual to find our son’s percentile rank. There was no information provided on how to interpret any of the scores.*  

Although the ongoing COVID-19 pandemic likely contributed to the confusion, our experience highlights broader problems with assessment information and transparency. Given calls to eliminate annual testing in schools, it’s increasingly important for states and districts to facilitate appropriate use of test scores so families understand what these tests do and do not tell us about student learning. The first step is providing parents with information that’s not only timely, but also accessible. Here are a few common issues.

Achievement Levels

To help with score interpretation, states are required to create at least three achievement levels. These achievement levels provide a rough indicator of whether a student is meeting grade-level requirements. However, parents are given little information about what the levels actually mean. The descriptions within the score report often use jargon that is likely unfamiliar to parents. For instance, an advanced student in mathematics has “a thorough understanding of Operations and Algebraic Thinking.” To understand what that means, parents would need to read the detailed performance level descriptors in a different manual or read their state’s standards. Another issue is that the definition of proficiency can vary from assessment to assessment, leaving parents to figure out why their child was designated “Some Risk” on one assessment and “Proficient” on another.

Raw Scores

Raw scores are the number of items that a student answered correctly. Sometimes assessments will report raw scores as a “subscore.” However, these numbers can be misleading without more context. For instance, if there were only four items for a particular subscore and a student missed two of the four, it could look like they were particularly weak in that area when the discrepancy may be an artifact of the test length.  
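
To see how test length distorts the picture, here’s a minimal sketch with hypothetical numbers (not drawn from any actual score report): the same two missed items look very different depending on how many items the subscore contains.

```python
# Hypothetical illustration: why a short subscore can exaggerate a weakness.
# The same two missed items look much worse on a 4-item subscore than on a
# 20-item subscore.

def subscore_percent(items_correct: int, items_total: int) -> float:
    """Percentage of subscore items answered correctly."""
    return 100 * items_correct / items_total

print(subscore_percent(2, 4))    # 50.0 -- two misses on a 4-item subscore
print(subscore_percent(18, 20))  # 90.0 -- the same two misses on a 20-item subscore
```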

Changes in the Assessment

Depending on the testing program, the interpretation of this year’s test scores may differ from previous years’, and it’s important to communicate the what and the why of those differences. For example, percentile ranks are typically based on the students who took the assessment during the first test administration. They’re referred to as the norm group, which provides a relatively stable comparison over time. A percentile rank essentially says that a student at the 50th percentile scored better than 50% of the students in the norm group. Changes to the norm group can make a big difference in interpretation because the reference point changes. In my state, the first administration of the test was in 2019, but the norm group was updated to students who tested in 2021.
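
To make the reference-point issue concrete, here’s a minimal sketch with made-up numbers (not an actual norm table) showing how the same score can land at different percentile ranks depending on which norm group it’s compared against.

```python
# Hypothetical illustration: the same scale score maps to different percentile
# ranks depending on which norm group it is compared against.

def percentile_rank(score, norm_group):
    """Percent of the norm group scoring below the given score."""
    below = sum(1 for s in norm_group if s < score)
    return 100 * below / len(norm_group)

# Made-up norm groups: scores from a 2019 administration vs. a 2021 one.
norms_2019 = [470, 480, 500, 510, 520, 540, 550, 560, 580, 600]
norms_2021 = [450, 470, 480, 490, 500, 510, 520, 540, 560, 580]

child_score = 530
print(percentile_rank(child_score, norms_2019))  # 50.0 -- middle of the 2019 group
print(percentile_rank(child_score, norms_2021))  # 70.0 -- higher against the 2021 group
```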

On the surface, this could be reasonable. Given disruptions in learning, families, teachers, and school leaders may want to know how students compare to others who have had similar disruptions to their schooling. But if a parent wants to know how much learning loss may have occurred and compare their child’s score to peers’ scores pre-pandemic, they’d need either to use the proficiency standards (advanced, proficient, not proficient, which are a fairly rough indicator given the range of scores) or to break out the 2019 technical manual and look up their child’s percentile rank.

These issues may sound minor, but they’re not. And, when poorly communicated, they reinforce the narrative that test scores aren’t useful or important and contribute to increased skepticism about testing. Although some of the shifts are unique to COVID-19, states also change tests, norm groups, and cut scores in non-pandemic times.  

Moving forward, increased transparency is needed to ensure that parents like my husband and me, districts, and policymakers better understand how to interpret and use the scores to track student growth. 

 

(*Our school district has a one-to-one device initiative and provides hotspots to families that don’t have internet access. In other districts, there may be substantial equity issues in distributing student scores through online platforms, as not all families have access to technology.)
