One of the main goals of creating and publishing the U.S. Education Innovation Index Prototype and Report was to stimulate evidence-based conversations about innovation in the education sector and push the field to consider more sophisticated tools, methods, and practices. Since its release three weeks ago at the Digital Promise Innovation Clusters convening in Providence, the index has been met with an overwhelmingly positive reception.
I’m grateful for the many fruitful one-on-one conversations that have pushed my thinking, raised interesting questions, and provoked new ideas.
Here are a few takeaways on the report itself:
People love radar charts. And I’m one of those people. In the case of the innovation index, radar charts were a logical choice for visualizing nine dimensions and a total score. Here they are again in all their glory.
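For readers who want to play with the format themselves, here is a minimal sketch of how nine dimension scores can be drawn on a radar chart using matplotlib's polar axes. The dimension names and scores below are hypothetical placeholders for illustration, not values from the report.

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

# Hypothetical placeholder data: nine dimensions scored 0-10.
# The index's real dimension names and scales may differ.
dimensions = [f"Dimension {i}" for i in range(1, 10)]
scores = [6, 4, 7, 5, 8, 3, 6, 5, 7]

# Spread the nine dimensions evenly around the circle, then close
# the polygon by repeating the first point at the end.
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
closed_angles = angles + angles[:1]
closed_scores = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"polar": True})
ax.plot(closed_angles, closed_scores, linewidth=2)
ax.fill(closed_angles, closed_scores, alpha=0.25)
ax.set_xticks(angles)
ax.set_xticklabels(dimensions, fontsize=8)
ax.set_ylim(0, 10)
fig.savefig("radar.png", bbox_inches="tight")
```

The filled polygon is what makes radar charts so readable at a glance: a city strong across all nine dimensions fills most of the circle, while a lopsided profile shows up immediately as a spike.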
Readers weren’t always clear on the intended audience or purpose. This concern came up often and hit close to home for someone who strives to produce work that is trusted, relevant, timely, and useful. One of the benefits of the prototype is that we can test the tool’s utility before expanding the scope of the project to more cities or an even more complicated theoretical framework. So far, the primary audience for the index — funders, policymakers, superintendents, education leaders, and city leaders — has demonstrated interest in learning more about the thinking behind the index and how it can be applied to their work. Ultimately, I hope it will influence high-stakes funding, policy, and strategic decisions.
The multidimensionality of innovation challenges assumptions. When I explain that we measured the innovativeness of education sectors in four cities — New Orleans, San Francisco, Indianapolis, and Kansas City, MO — the next question I get is inevitably, “How do they rank?” Instead of answering, I ask the questioner for their own ranking. I’ve had this exchange dozens of times, and in almost every case New Orleans topped the list because of its unique charter school environment. When I then explain that the index is sector-agnostic — it doesn’t give preference to charter, district, or private schools — people immediately reconsider and put San Francisco in the number one slot. What this tells me is that many people associate innovation with a single approach rather than treating it as the multidimensional concept that it is. This misperception has real policy and practice implications, and I hope the index adds nuance to the thinking of decision makers.
“Dynamism” and “district deviance” are intriguing but need more research. Two of the measures that I’m most excited about are also ones that have invited scrutiny and criticism: dynamism and district deviance. Dynamism is the entry and exit of schools, nonprofits, and businesses from a city’s education landscape. Too much dynamism can destabilize communities and economies. Too little can keep underperforming organizations operating at the expense of new and potentially better ones. In the private sector, healthy turnover rates are between five and 20 percent, depending on the industry. We don’t know what that number is for education yet, but it’s likely on the low end of the range. More research is needed. Our district deviance measure assumes that districts that spend their money differently from their peers are trying new things, which is good. It’s a novel approach, but its accuracy is vulnerable if the assumptions don’t pan out. Again, more research is needed.
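To make the dynamism idea concrete, here is a toy sketch of one plausible way to compute a turnover rate from counts of entries and exits. The report’s actual formula is not reproduced here, so treat the function and the numbers as illustrative assumptions only.

```python
def turnover_rate(entries: int, exits: int, total_orgs: int) -> float:
    """Share of a city's education landscape that churned in a period:
    (entries + exits) / total organizations. This is one simple
    convention, not necessarily the measure used in the index."""
    if total_orgs <= 0:
        raise ValueError("total_orgs must be positive")
    return (entries + exits) / total_orgs

# Hypothetical example: 6 new schools opened and 4 closed in a city
# with 100 education organizations -> 10% turnover, which would sit
# inside the 5-20% band cited for healthy private-sector industries.
rate = turnover_rate(entries=6, exits=4, total_orgs=100)
print(f"{rate:.0%}")  # -> 10%
```

The interesting policy question is where the healthy band lies for education specifically; as noted above, that number is still unknown and likely sits toward the low end.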
Measure more cities! Everyone wants to see more cities measured with the index, for one of two reasons. The first is that they want to know how their own city is doing on our nine dimensions. The second is that they want to compare cities to each other. Both make my heart sing. Knowing how a specific city measures up is the first step to improving it. Knowing how it compares to others is the first step to facilitating knowledge transfer and innovation diffusion.