Building a School Performance Framework for School Improvement? Lessons From New York City.

Dr. Melissa Morris, Principal of Shaw Heights Elementary School. Photo Source: Shaw Air Force Base.

It’s easy to say a school performance framework (SPF) will be useful to school leaders, but it’s another thing entirely to actually design an SPF with school continuous improvement as the primary purpose. In fact, some of the design features that might make an SPF more useful to principals on a day-to-day basis might mean the SPF is less useful for families and system leaders. 

In our recent project available at SchoolPerformanceFrameworks.org, my co-authors and I identified school-level continuous improvement as one of three primary “use cases” that could shape SPF design decisions. A “use case” is a concept borrowed from the world of technology and design meant to help designers (in this case, local education leaders) think through their users’ needs. The other two use cases for SPFs are “system management and accountability” and “family and community information.” We found that too many SPFs try to fulfill multiple uses at once, without clearly thinking through priorities and potential tradeoffs.

An SPF designed for school continuous improvement is meant for principals, school academic leaders, and the front-line district staff supporting them to shape school operations, culture, staffing, and student experience — all in service of better student outcomes. An SPF built primarily for this purpose should allow principals to understand how they are performing relative to district goals, diagnose key strengths and weaknesses, and flag any potential problems. What would an SPF for this purpose include in terms of design?

  • Detailed, frequently updated data about students, teachers, and school culture: An SPF designed for school leaders should include a wide variety of data — like student academic performance and growth, attendance, re-enrollment, school culture, staff turnover, and student discipline rates. School leaders should be able to track their school’s progress against internal goals and external benchmarks. Metrics should be available frequently and dynamically: waiting a year or more for results is rarely useful in the course of running a school. 
  • Early and frequent engagement of school leaders and educators in the design process: The design process should involve school leaders and educators from the start, to understand what data and tools they already use frequently and what might be missing. The format and presentation of information in the SPF should be geared toward school leaders’ priorities, but aligned to district goals.
  • Training on how to interpret and use data effectively: School leaders will need support to understand the SPF and incorporate it into their planning and decision processes successfully. Training, resources, and clear expectations for principals and their leadership teams will help ensure a clear connection between the SPF and regular decision cycles, and that school leaders actually put the information and tools to use as they make decisions around curriculum, staffing, or new initiatives. 

Among the five long-standing SPFs we examined closely for this report, only New York City prioritized school-level continuous improvement in its goals and design. NYC’s school quality reports offer a detailed and dynamic look at an array of student outcomes, with several ways to benchmark progress against district goals and trends in other schools. They also include detailed qualitative information from on-site reviews and student surveys. NYC does not assign schools a single rating, but instead offers seven ratings across different academic and non-academic metric categories.

However, for this purpose, the tool falls short on timely data: most data are reported only annually, and on-site reviews may be several years old for many schools. And in its pursuit of comprehensive measures, NYC may offer too many metrics for principals, or any other user, to efficiently understand and act on.

School continuous improvement is the most difficult use case to combine with other uses. The detail and frequency of data that principals need is less useful for a system leader or a parent making decisions on an annual basis. System leaders and families may be better served by a tool that clearly focuses on key student outcomes, allows for easy cross-school comparison, and provides the additional clarity of a single school rating.

Even if an SPF is not designed with school continuous improvement as its top priority, school leaders may still find it useful. An SPF that focuses on system management or family needs could help leaders set goalposts for their more detailed continuous improvement plans. In this case, additional principal data resources could help translate the SPF into school-level action.

To learn more about other use cases for SPF design, and other long-standing local SPFs, visit SchoolPerformanceFrameworks.org. And stay tuned for more posts exploring the other “use cases” for SPFs.