As my colleagues noted yesterday, Denver leaders are currently hosting conversations about their local school rating system, called the School Performance Framework (SPF), and deciding whether they will abandon this local system in favor of Colorado’s state rating system.
Districts around the country are facing similar choices this year — whether to build, adopt, or abandon a local rating system — as states roll out new report cards. The federal Every Student Succeeds Act (ESSA), passed in 2015, requires states to improve the way they rate schools. In response, states created report cards with key performance data for every school in their state. But not all communities were satisfied with their state ESSA report card.
Some districts created — and others are currently considering — localized school rating systems to fill in the gaps. These are an enormous opportunity for school districts, but one with many risks if districts do not heed the lessons of the past and pay attention to today’s context. In the case of Denver, it’s clear that local options must be built carefully in order to survive shifting political contexts.
ESSA report cards promised to include more impactful data than required by ESSA’s predecessor, No Child Left Behind. Yet the truth is many state report cards are no better than what came before. An April 2019 analysis by the Data Quality Campaign found that many state report cards still lack critical information — including the progress and growth of different student groups and students’ access to high-quality teachers — making it difficult for families and communities to understand if and how schools are serving their kids.
As school districts step in to create local versions of school report cards, the question is: will these local remedies provide a more complete picture of school quality or will they confuse parents and other stakeholders even more?
Families and communities need access to reliable, understandable information about school quality to make decisions for their students. One tool district leaders can use to provide this information is a school performance framework (SPF). But SPFs are only useful to families if they are designed with families in mind. If leaders treat the needs of families as an afterthought during the design phase, it should be no surprise when families don’t use the tool.
In our recent project at SchoolPerformanceFrameworks.org, my co-authors and I identified family and community information as one of three primary “use cases” that could shape SPF design decisions. My colleague Bonnie O’Keefe explains the concept of use cases and offers another example — school continuous improvement — here.
An SPF designed to show families and communities how schools are performing should include:
Early, authentic, and ongoing engagement of families and community members in the design process: District leaders should involve families from the beginning to understand what information they need or may already have. This can be accomplished through task forces, roundtables, or listening sessions, or by administering parent surveys. Leaders should be cautious not to engage only the most visible stakeholders, but instead should use various methods to engage families that will be most impacted by the SPF. Inauthentic engagement risks alienating key stakeholders and reinforcing harmful power dynamics.
The information families and community members most want to know: Families typically prefer a higher-level summary, a focus on outcomes, and a summative rating, because these are easier to understand. This contrasts significantly with the granular level of detail school leaders might need. If leaders create a tool that primarily serves families, the SPF might be less useful to school leaders or system leaders.
Results displayed in an understandable and accessible way: One reason families may struggle to understand school performance frameworks is that they are often full of jargon. For example, parent advocacy organization Learning Heroes has found that someone could misread the phrase “School Climate” on a school report card to mean building temperature as opposed to the quality of school life. District leaders should present data to families that is free of jargon and available in high-quality multilingual translations.
Many of the districts profiled in our report have made improvements to their SPF over the years to make them more accessible to parents. For example, the School Quality Rating Policy (SQRP) in Chicago was not originally designed with families in mind, but the growth of school choice options prompted the district to make changes to give families access to more transparent, shared information across schools. SQRP reports now include the size of the school, the names and contact information for school leaders, programmatic offerings, and information about transportation options to each school. Reports are available in multiple languages and families can easily find the definitions of key terms within one click.
While an increasing number of cities have implemented school performance frameworks (SPFs), very little has been written about how these tools compare with one another.
SPFs provide information on school performance and quality across a variety of measures to numerous stakeholders, and New York, New Orleans, Denver, Chicago, and Washington, D.C. have all had their own version for, in some cases, more than five years.
Still, few resources exist for district leaders interested in SPF redesign or development. That’s where Bellwether’s newest project comes in.
My colleagues and I recently conducted in-depth case studies of four rural charter schools that outperform state and district averages in reading and math. We then published those case studies, and the lessons they surfaced, in a new website: ruralcharterschools.org.
Today, I have a new piece out in Education Post that profiles one school we visited, Crossroad Academy Charter School in Gadsden County, Florida. The piece explores the unique relationship between Crossroad Academy and a local HBCU, and explains how Crossroad students benefit from that relationship:
When Crossroad’s leader Kevin Forehand, an alum of Florida A&M and Gadsden County native, began his tenure as principal, the school’s student body was growing rapidly. As a result, the school needed a larger teaching force. Mr. Forehand recognized the importance of recruiting and hiring young, ambitious Black talent to teach at his school, and later developed a mentorship program between his alma mater and Crossroad Academy. Through this partnership, university staff and students help Crossroad high school students prepare for the college application process and review their application materials. In return, all Crossroad seniors apply to Florida A&M to ensure that they have at least one high-quality postsecondary option.
I have a new piece out in Education Week that focuses on teacher-designed assessments. In it I argue that while teacher-designed assessments can be more beneficial to student learning than commercially prepared assessments, teacher survey data suggest that most teachers don’t feel they have the appropriate skills to design high-quality assessments:
National teacher polling data suggest that I was not alone. A 2016 Gallup poll found that roughly 30 percent of teachers do not feel prepared to develop assessments. Fewer than 50 percent of teachers in low-income schools reported feeling “very prepared” to interpret assessment results, and fewer than 50 percent of teachers said they’d received training on how to talk with parents, fellow teachers, and students about assessment results. More alarming is that no state requires teachers to be certified in the basics of assessment development, so it’s likely that many teachers have never had any formal assessment training.
I highlight work underway in New Hampshire and Michigan to make significant investments in assessment literacy training for educators. More states should follow the lead of these exemplars and commit to equipping all educators with the tools to develop high-quality, rigorous assessments.