As my colleagues noted yesterday, Denver leaders are currently hosting conversations about their local school rating system, called the School Performance Framework (SPF), and deciding whether they will abandon this local system in favor of Colorado’s state rating system.
Districts around the country are facing similar choices this year — whether to build, adopt, or abandon a local rating system — as states roll out new report cards. The federal Every Student Succeeds Act (ESSA), passed in 2015, requires states to improve the way they rate schools. In response, states created report cards with key performance data for every school in their state. But not all communities were satisfied with their state ESSA report card.
Some districts created — and others are currently considering — localized school rating systems to fill in the gaps. These are an enormous opportunity for school districts, but one with many risks if districts do not heed the lessons of the past and pay attention to today’s context. In the case of Denver, it’s clear that local options must be built carefully in order to survive shifting political contexts.
ESSA report cards promised to include more impactful data than was required by ESSA's predecessor, No Child Left Behind. Yet the truth is that many state report cards are no better than what came before. An April 2019 analysis by the Data Quality Campaign found that many state report cards still lack critical information — including the progress and growth of different student groups and students' access to high-quality teachers — making it difficult for families and communities to understand if and how schools are serving their kids.
As school districts step in to create local versions of school report cards, the question is: will these local remedies provide a more complete picture of school quality or will they confuse parents and other stakeholders even more?
The answer: it depends.
If district leaders design SPFs to reflect local goals and priorities for their students and avoid duplicating the purposes of the state report card, then yes, local SPFs can provide a clearer, more nuanced picture of school quality and be a valuable tool for communication about a school district’s unique goals. However, if district leaders don’t assess the existing state system to avoid redundancy, they may end up confusing parents with competing information on a community’s schools.
On Bellwether's recent website highlighting lessons from five cities that have implemented SPFs, we found that some districts differentiate their local SPFs from state ratings by developing an entirely new system with metrics aligned to local goals. For example, Chicago Public Schools prioritizes student growth by using NWEA MAP test results and assessing student progress both during and between school years. While the state system also measures growth, it uses a test that provides less actionable information to schools and teachers.
Other districts have adapted the state system to meet local needs instead of building an entirely new tool, often because they have limited time or resources to devote to building a brand-new SPF. For example, in New Orleans, the Orleans Parish School Board learned that families and community members understood the existing state system and found it to be a reliable source of information about school quality. The Board still relies on the state system, but it places additional weight on student growth when making charter school renewal and closure decisions to counteract the state's narrow emphasis on student achievement.
District leaders frustrated with a lack of nuance or depth in state report cards often view local SPFs as an opportunity to clearly define what matters when evaluating school quality. But they need to remember that SPFs should make it easier for audiences to understand school performance. If they conflict with existing state systems, these new tools may be underused and offer confusing messages.