
What This Washington Post Opinion Piece Got Wrong on Charter Schools

Over the weekend, the Washington Post Outlook section ran a frustrating cover story on charter schools that offered a narrow and biased picture of the charter sector and perpetuated a number of misconceptions.

Jack Schneider’s “School’s out: Charters were supposed to save public education. Why are Americans turning against them?” argues that the charter sector as a whole isn’t living up to its promises, leading public support for the schools to shrink. Schneider is correct that the charter sector hasn’t lived up to all of its most enthusiastic boosters’ promises, but his piece flatly misrepresents data about charter quality. For example, Schneider writes that “average charter performance is roughly equivalent to that of traditional public schools.” This is simply inaccurate, as my colleagues indicated in a recent analysis of charter data and research (slide 37 here). The full body of currently available, high-quality research finds that charters outperform traditional public schools on average, with especially positive effects for historically underserved student groups (a recent Post editorial acknowledged this as well).

[Image: slide from Bellwether's "State of the Charter Sector" resource, summarizing research on charter sector performance]

To be clear, research also shows that charter performance varies widely across schools, cities, and states — and too many schools are low-performing. Yet Schneider cherry-picks examples that illustrate low points in the sector. He cites Ohio, whose performance struggles — and the poorly designed policies that led to them — Bellwether has previously written about. He also (inexplicably, given where his piece ran) overlooks Washington, D.C., where charters not only significantly outperform comparable district-run schools, but have also helped spur improvement systemwide.

Over the past decade, public schools in D.C. (including both charters and DC Public Schools, DCPS) have improved twice as fast as those in any other state in the country, as measured by the National Assessment of Educational Progress (NAEP). DCPS was the nation’s fastest-growing district in 4th grade math and among the fastest in 4th grade reading and 8th grade math. These gains can be partially attributed to the city’s changing demographics, but they are also the result of reforms within DCPS — reforms that the growth of charters created the political will to implement. Over the past decade, Washington, D.C. has also increased the number of high-performing charter schools while systematically slashing the number of students in the lowest-performing charter schools. When I served on the District of Columbia Public Charter School Board from 2009 to 2017, I had the chance to observe these exciting changes firsthand, so it was particularly disappointing to see a major feature in our city’s paper overlook them.

It’s frustrating that this biased and narrow picture drew prime real estate in one of the nation’s leading papers, because the charter sector does have real weaknesses and areas for improvement that would benefit from thoughtful dialogue. For example, as Schneider notes, transportation issues and lack of good information can prevent many families from accessing high-quality schools. In cities with high concentrations of charters, such as Washington, D.C. and New Orleans, there is a real need to better support parents in navigating what can feel like a very fragmented system. And despite progress in closing down low-performing charter schools, too many remain in operation. Schneider could have referenced the real work charter leaders are undertaking to address these lingering challenges (more on this in slide 112 of our deck).

Schneider is correct that public support for charters has waned in recent years, due in part to some of the challenges he references, but also because of orchestrated political opposition from established interests threatened by charter school growth. Given the increasingly polarized political environment around charter schools, the need for nuanced, balanced, and data-informed analysis and dialogue about them is greater than ever. Bellwether’s recent report on the state of the charter sector, and our past work on charter schools more broadly, seek to provide that kind of analysis. Unfortunately, Schneider’s piece falls short on that score.

The Perry Preschoolers are All Grown Up and Their Experiences Continue to Guide the Field

If you’ve ever sat through a presentation on education research or early childhood education, you’ve likely heard of the Perry Preschool project. This seminal research study examined 123 preschool children in Ypsilanti, Michigan, who were at risk for school failure. The kids were randomly divided into two groups: One group attended a high-quality preschool program, while the comparison group received no preschool education. The participants were tracked throughout their lifetimes.

The widely studied long-term positive results of attending the preschool included higher rates of graduating high school, higher employment rates, higher earnings, fewer teen pregnancies, and less criminal behavior. As one of the only randomized controlled trials in early childhood education, the Perry Project remains widely cited.

Even though fifty years have passed since the Perry Preschool program actively served children, the results still offer lessons for the early childhood education field.

Current discussions of early childhood interventions often focus on whether pre-K programs raise children’s readiness for kindergarten or their elementary school test scores. But new research from Nobel Prize-winning economist James Heckman and co-author Ganesh Karapakula — the first analysis of Perry Preschool participants through mid-life — illustrates the short-sightedness of this approach. Their report demonstrates that high-quality early childhood interventions can have a dramatic impact not only on program participants’ life outcomes but also on the life outcomes of their future offspring. Some of their findings: Continue reading

Learning from a Missed Opportunity in Massachusetts

If current predictions hold, several states will either set new limits on charter school growth and expansion or stand by existing ones. These limits, called charter school caps, place a ceiling on the number of charter schools or the number of students those schools can enroll. Massachusetts took that path in 2016: Voters rejected Ballot Question 2, which would have raised the state’s cap on charter schools. But research released just last week suggests that Massachusetts’ voters made a mistake. The states currently considering similar legislation should pay attention.

In the study I’m referencing, authors Sarah Cohodes, Elizabeth Setren, and Christopher R. Walters examined the effect of a policy that allowed effective charter schools in Boston to replicate their school models at new locations. They found that these new schools produced large achievement gains that were on par with those of their parent campuses. What’s more, the average effectiveness of charter middle schools in the city increased after the policy reform.

This evidence could, perhaps, be dismissed if the sector saw only a marginal increase in the number of schools; that is, if there were only a few additional charter schools that pulled this off. But that’s not the case: Boston’s charter sector produced these results despite a doubling of the charter market share in the city.

This analysis would be a big deal for any charter sector, but it is particularly meaningful for Boston. As Bellwether found in a recent analysis of the charter sector, Boston has the highest-performing urban charter sector in the country. The average child who attended Boston charter schools gained roughly a full year of additional learning compared to students in traditional public schools: 170 additional days of learning in reading and 233 days of learning in math. And the research suggests that Boston charter schools have strong, positive effects on the learning outcomes of students with disabilities and English-language learners as well. The implication here is that not only did Boston’s charter schools replicate their impact, they replicated some of the most effective charter schools we’ve ever seen, to the benefit of the thousands of students in Boston who are on charter school waitlists.

The states that are poised to double down on charter caps — such as New York, Maine, and California — shouldn’t make the same mistake Massachusetts did in 2016. New York, in particular, is at risk here: In our analysis earlier this year, we examined the existing evidence on New York and New York City and found that there, too, charters are more effective than traditional public schools. By committing to the cap, the state is denying thousands of students the opportunity to attend high-quality schools.

To be sure, there are reasons to question the growth of a charter sector other than whether charters can replicate effectiveness across schools. Charter critics cite, for example, concerns about the effect of charter sector growth on traditional public school enrollment. But, particularly during National Charter Schools Week, states should be skeptical of arguments used to support charter school caps that claim charter schools cannot be replicated effectively.

Why Some Educators Are Skeptical of Engaging in Rigorous Research — And What Can Be Done About It

In my previous post, I talked about the importance of rigorous research and the need for researchers to engage directly with education stakeholders. Yet some educators remain skeptical about the value of partnering with researchers, even if the research is relevant and rigorous. Why might education agencies fail to see the value of conducting rigorous research in their own settings?

For one thing, letting a researcher into the nitty-gritty of your outcomes or practices might reveal that something isn’t working. And since it’s rare that educators/practitioners and researchers are even in the same room, education agency staff may be concerned about how findings will be framed once publicized. If they don’t even know one another, how can we expect researchers and educators to overcome their lack of trust and work together effectively?

Furthermore, engaging with researchers takes time and a shift in focus for staff in educational agencies, who are often stretched to capacity with compliance and accountability work. Additionally, education stakeholders may have strong preferences for certain programs or policies, and thus fail to see the importance of assessing whether these are truly yielding measurable improvements in outcomes. Finally, staff at educational agencies may need to devote time to help researchers translate findings, since researchers are not accustomed to creating summaries of research that are accessible to a broad audience.

Given all this, why am I still optimistic about connecting research, practice, and policy? Continue reading

Why Is There a Disconnect Between Research and Practice and What Can Be Done About It?

What characteristics of teacher candidates predict whether they’ll do well in the classroom? Do elementary school students benefit from accelerated math coursework? What does educational research tell us about the effects of homework?

[Image: three interconnected cogs, labeled "policy," "practice," and "research"]

These are questions that I’ve heard over the past few years from educators who are interested in using research to inform practice, such as the attendees of researchED conferences. These questions suggest a demand for evidence-based policies and practices among educators. And yet, while the past twenty years have witnessed an explosion in federally funded education research and research products, data indicate that many educators are not aware of federal research resources intended to support evidence use in education, such as the Regional Educational Laboratories or the What Works Clearinghouse.

Despite a considerable federal investment in both education research and structures to support educators’ use of evidence, educators may be unaware of evidence that could be used to improve policy and practice. What might be behind this disconnect, and what can be done about it? While the recently released Institute of Education Sciences (IES) priorities focus on increasing research dissemination and use, they concentrate mainly on producing and disseminating research: the supply side of the equation. Continue reading