
Seriously, Stop Asking If Head Start “Works”

Last month, yet another study came out examining the effects of Head Start on children’s long-term outcomes. The findings were lackluster: Depending on the cohort of children and outcomes you’re looking at, the effect of Head Start was either negative or non-existent. 

This study is noteworthy for a few reasons. It uses the same analytical approach as a high-profile 2009 study on Head Start, conducted by Harvard economist David Deming, which found Head Start had unquestionably positive results. And in a twist I’m definitely reading too much into, a former Deming student is one of the lead co-authors on this new study. People are also paying attention to this study because the findings go against a truly massive body of evidence on Head Start, which largely shows that Head Start has positive effects on children and families. 

But what snagged my attention is the fact that the research question at the heart of this study is irritatingly useless. It asks, essentially, “Does Head Start work?” That’s a question we answered a long time ago. And the answer is: It depends.

Again, the existing research on Head Start overall is positive. But we also know that there is wide variation in quality between individual Head Start providers. It’s a valuable federal program that can get better.

Learning from a Missed Opportunity in Massachusetts

If current predictions hold, several states will either set new limits on charter school growth and expansion or stand by the limits they already have. These limits, called charter school caps, place a ceiling on the number of charter schools in a state or the number of students those schools can enroll. In 2016, Massachusetts did the same thing: Voters rejected Ballot Question 2, which would have raised the cap on charter schools in the state. But research released just last week suggests that Massachusetts voters made a mistake. The states currently considering similar legislation should pay attention.

In the study I’m referencing, authors Sarah Cohodes, Elizabeth Setren, and Christopher R. Walters examined the effect of a policy that allowed effective charter schools in Boston to replicate their school models at new locations. They found that these new schools produced large achievement gains that were on par with those of their parent campuses. What’s more, the average effectiveness of charter middle schools in the city increased after the policy reform.

This evidence could, perhaps, be dismissed if the sector saw only a marginal increase in the number of schools; that is, if there were only a few additional charter schools that pulled this off. But that’s not the case: Boston’s charter sector produced these results despite a doubling of the charter market share in the city.

This analysis would be a big deal for any charter sector, but it is particularly meaningful for Boston. As Bellwether found in a recent analysis of the charter sector, Boston has the highest-performing urban charter sector in the country. The average child who attended a Boston charter school benefited from roughly a full year of additional learning compared to students in traditional public schools: 170 additional days of learning in reading and 233 days of learning in math. And the research suggests that Boston charter schools have strong, positive effects on the learning outcomes of students with disabilities and English-language learners, as well. The implication here is that Boston’s charter schools not only replicated their impact, they replicated some of the most effective charter schools we’ve ever seen, to the benefit of the thousands of students in Boston who are on charter school waitlists.

The states that are poised to double down on charter caps — such as New York, Maine, and California — shouldn’t make the same mistake Massachusetts did in 2016. New York, in particular, is at risk here: In our analysis earlier this year, we examined the existing evidence on New York and New York City and found that there, too, charters are more effective than traditional public schools. By committing to the cap, the state is denying thousands of students the opportunity to attend high-quality schools.

To be sure, there are reasons to question the growth of a charter sector beyond whether charters can replicate their effectiveness across schools. Charter critics cite, for example, concerns about the effect of charter sector growth on traditional public school enrollment. But, particularly during National Charter Schools Week, states should be skeptical of arguments for charter school caps that rest on the claim that effective charter schools cannot be replicated.

Why Is There a Disconnect Between Research and Practice and What Can Be Done About It?

What characteristics of teacher candidates predict whether they’ll do well in the classroom? Do elementary school students benefit from accelerated math coursework? What does educational research tell us about the effects of homework?

[Figure: three interconnected cogs labeled policy, practice, and research]

These are questions that I’ve heard over the past few years from educators who are interested in using research to inform practice, such as the attendees of researchED conferences. These questions suggest a demand for evidence-based policies and practice among educators. And yet, while the past twenty years have witnessed an explosion in federally funded education research and research products, data indicate that many educators are not aware of federal research resources intended to support evidence use in education, such as the Regional Education Laboratories or What Works Clearinghouse.

Despite a considerable federal investment in both education research and structures to support educators’ use of evidence, educators may be unaware of evidence that could be used to improve policy and practice. What might be behind this disconnect, and what can be done about it? While the recently released Institute of Education Sciences (IES) priorities focus on increasing research dissemination and use, they concentrate mainly on producing and disseminating research: the supply side.

Expand Your Ed Policy Toolkit with Human-Centered Design

Design Methods for Education Policy Website


In February, I released a white paper making the case that policy professionals can create better education policies by using human-centered research methods because these methods are informed by the people whose lives will be most affected.

Yesterday, we released a companion website (https://designforedpolicy.org/) that curates 54 human-centered research methods well-suited to education policy into one easy-to-navigate resource. We took methods from organizations like IDEO, Stanford’s Hasso Plattner Institute of Design, and Nesta and organized them by the phases of a typical education policy project. We included brief explanations of how each method might be applied to your current work.

To be sure, you probably already use some human-centered design methods in your work, even if you don’t think of them that way. Interviews and observations are commonplace and provide highly valuable information. What the design world brings is a mindset that explicitly and deeply values the lived experiences of the people who are most impacted by problems and an array of methods to capture and analyze that information. It also adds a heavy dose of creativity to the process of identifying solutions. And despite a common misconception, when done well, human-centered design methods are very rigorous, fact-based, and structured to root out assumptions and biases.

When combined, common policy analysis methods and human-centered design methods can result in a powerful mix of quantitative and qualitative, deductive and inductive, macro and micro, rational and emotional elements.

Education Policy, Meet Human-Centered Design

In a lot of ways, the worlds of education policy and human-centered design couldn’t be more dissimilar. The former relies heavily on large-scale quantitative analysis and involves a long, complex public process. The latter is deeply qualitative, fast moving, creative, and generative. Policy professionals come up through the ranks in public agencies, campaigns, and think tanks. Deep issue expertise and sophisticated deductive reasoning are highly valued. Designers come from an array of backgrounds — the more unorthodox the better. Success for them comes from risk-taking, novel ideas, and synthesizing concepts across time, space, and sectors.

figure from Creating More Effective, Efficient, and Equitable Education Policies with Human-Centered Design comparing policy and design methods

I’m fortunate to have spent some time in both worlds. They each appeal to different parts of my personality. Policy analysis affords me order and confidence in answers based on facts. Design lets me flex my creative muscles, fail fearlessly, and have confidence in answers based on experience.

So when a grant from the Carnegie Corporation of New York gave me the opportunity to write a paper about bringing these two worlds together, I jumped at the chance — I knew that each could benefit from the other.

Creating More Effective, Efficient, and Equitable Education Policies with Human-Centered Design makes the case that policy practitioners can use human-centered methods to create better education policies because the resulting policies are informed by the people whose lives will be most affected by them.

The underpinning hypothesis is that 1) co-designing policies with constituents can generate more accurate definitions of problems and more relevant solutions, 2) human-centered design can generate a wider variety of potential solutions, leading to innovation, and 3) the process can mitigate or reverse constituents’ sense of disenfranchisement from the lawmaking process.

Human-centered policy design is still a new practice, however, and there are important questions to work out, like how to make sure the process is inclusive and where exactly human-centered design methods can enhance policy research and design.

Luckily, SXSW EDU, a huge national conference focused on innovation in education, is a perfect place to test new ideas. So I reached out to Maggie Powers, director of STEAM Innovation at Agnes Irwin School and a member of IDEO’s Teachers Guild, and Matt Williams, vice president of Education at Goodwill of Central Texas, to explore what it would look like to apply human-centered design to policies that affect high school students whose education suffers because of lost credits when they transfer schools. Our session will pressure-test some of the ideas that emerged in the paper. The results will inform the next phase of this work, which will help policy practitioners implement human-centered design methods. Keep an ear to the ground for that!