
Deep into the new school year, we’re still missing a lot of students

[Image: An empty elementary school classroom. Source: Wikimedia]

Educators, parents, and policymakers have been concerned about the effects of the pandemic on student learning ever since it forced the abrupt end of in-person instruction in March. In October, my colleagues and I estimated that 3 million students were at high risk of having had little to no education since then. NWEA, the organization that runs the popular MAP Growth exam, estimated in April 2020 that learning loss due to spring school closures and the “summer slide” would set students back, on average, by 30% of a year in reading and more than half a year in math.

The new school year has brought new data on student performance, and the early returns seem less dire than those original projections — with a major caveat. In a new brief with fall data, NWEA found that students in their test sample started the 2020-21 school year in roughly the same place in reading as similar students at the start of 2019-20, and about 5-10 percentile points lower in math. This was a huge sample of 4.4 million students spanning grades 3 through 8, so relatively minor slowdowns in math progress seem worth celebrating.

But these findings are not all good news. The authors note that many of the observable declines were concentrated disproportionately among Black and Hispanic students. Most significant of all, fully 25% of students who took the MAP last year didn't take it this year. In a "normal" year, that dropoff rate is more like 15%, which suggests that many students are missing from this year's data. These could be new homeschoolers or private school enrollees, or they could be disconnected from the school system altogether.

This aligns with other early state-level estimates of enrollment declines. Connecticut's fall 2020 enrollment is down roughly 3%, as are Washington's and Missouri's. Georgia's statewide enrollment is down 2.2%. Most of those declines are concentrated in kindergarten and pre-K, often in double digits. Each of these newly available data points seems to provide evidence of a big picture that is potentially devastating: as many as three million students missing from school.
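To put these figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The 15% and 25% attrition rates and the 4.4 million student sample come from the NWEA brief discussed above; the roughly 50 million national public school enrollment figure is an assumption for illustration, and this arithmetic is not the method behind the three million estimate.

```python
# Rough, illustrative arithmetic only. The attrition rates and the 4.4M
# sample come from the NWEA brief cited above; the ~50M national
# enrollment total is an assumption, and this is not how the "3 million"
# estimate in the Bellwether report was derived.

normal_attrition = 0.15       # typical year-over-year MAP dropoff
observed_attrition = 0.25     # dropoff observed in fall 2020
fall_2020_sample = 4_400_000  # students in NWEA's fall 2020 sample

# This year's sample is the retained share of last year's testers,
# so last year's group was roughly sample / (1 - attrition).
last_year_testers = fall_2020_sample / (1 - observed_attrition)
excess_missing = last_year_testers * (observed_attrition - normal_attrition)
print(f"Excess attrition in the MAP data alone: ~{excess_missing:,.0f} students")

# Scaling the reported 2-3% state enrollment declines to a national total:
national_enrollment = 50_000_000  # assumption: approximate U.S. K-12 enrollment
for decline in (0.022, 0.03):
    print(f"{decline:.1%} of national enrollment ≈ {national_enrollment * decline:,.0f} students")
```

Even these simple scalings put the number of unaccounted-for students well into the hundreds of thousands to low millions, before considering the double-digit kindergarten and pre-K losses noted above.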

It’s important to consider here that these missing students — missing from school, and missing from the NWEA MAP data — include those most likely to be deeply affected by the pandemic. In an addendum to the NWEA brief, authors Angela Johnson and Megan Kuhfeld warn that these new learning loss estimates must be considered with this in mind: that the students being tested now are on average less racially diverse (and whiter) and attending socioeconomically more advantaged schools. This is emblematic of what we have seen playing out across the country all year. Generally speaking, more well-off students and their families have the resources to withstand the pressure of the pandemic to an extent that their lower-income peers do not, resulting in two increasingly divergent education systems: one where frequent testing, hybrid learning, and private tutoring are available — and one where they are not.

While this challenge is immense and likely to be with us for some time, there are steps policymakers can take immediately that will better position states and districts for the long haul. The new enrollment figures underscore an urgent need for improved attendance and enrollment data, and for faster reporting that will enable schools to be responsive and flexible in tracking down "missing" students. There is also a need for attendance intervention strategies that start with an informed understanding of students' unmet needs, and for collaboration with social service agencies and other community-based organizations that can work to meet those needs. States can start by providing the funding to make these interventions possible.

For more on the 3 million students missing in the margins, you can read Bellwether’s report here.

Seriously, Stop Asking If Head Start “Works”

Last month, yet another study came out examining the effects of Head Start on children’s long-term outcomes. The findings were lackluster: Depending on the cohort of children and outcomes you’re looking at, the effect of Head Start was either negative or non-existent. 

This study is noteworthy for a few reasons. It uses the same analytical approach as a high-profile 2009 study on Head Start, conducted by Harvard economist David Deming, which found Head Start had unquestionably positive results. And in a twist I’m definitely reading too much into, a former Deming student is one of the lead co-authors on this new study. People are also paying attention to this study because the findings go against a truly massive body of evidence on Head Start, which largely shows that Head Start has positive effects on children and families. 

But what snagged my attention is the fact that the research question at the heart of this study is irritatingly useless. It asks, essentially, “Does Head Start work?” That’s a question we answered a long time ago. And the answer is: It depends.

Again, the existing research on Head Start overall is positive. But we also know that there is wide variation in quality among individual Head Start providers. It's a valuable federal program that can get better.

Learning from a Missed Opportunity in Massachusetts

If current predictions hold, several states will either set new limits on charter school growth and expansion or stand by current ones. These limits, called charter school caps, place a ceiling on the number of charter schools or the number of students those schools can enroll. In 2016, Massachusetts did the same thing: Voters rejected Ballot Question 2, which would have raised the cap on charter schools in the state. But research released just last week suggests that Massachusetts voters made a mistake. States currently considering similar legislation should pay attention.

In the study I’m referencing, authors Sarah Cohodes, Elizabeth Setren, and Christopher R. Walters examined the effect of a policy that allowed effective charter schools in Boston to replicate their school models at new locations. They found that these new schools produced large achievement gains that were on par with those of their parent campuses. What’s more, the average effectiveness of charter middle schools in the city increased after the policy reform.

This evidence could, perhaps, be dismissed if the sector saw only a marginal increase in the number of schools; that is, if there were only a few additional charter schools that pulled this off. But that’s not the case: Boston’s charter sector produced these results despite a doubling of the charter market share in the city.

This analysis would be a big deal for any charter sector, but it is particularly meaningful for Boston. As Bellwether found in a recent analysis of the charter sector, Boston has the highest-performing urban charter sector in the country. The average child who attended a Boston charter school gained roughly a full year of additional learning compared to students in traditional public schools: 170 additional days of learning in reading and 233 in math, against a typical 180-day school year. And the research suggests that Boston charter schools have strong, positive effects on the learning outcomes of students with disabilities and English-language learners as well. The implication is that Boston's charter schools did not merely replicate their impact; they replicated some of the most effective charter schools we've ever seen, to the benefit of the thousands of students in Boston on charter school waitlists.

The states poised to double down on charter caps, such as New York, Maine, and California, shouldn't make the same mistake Massachusetts did in 2016. New York, in particular, is at risk here: In our analysis earlier this year, we examined the existing evidence on New York and New York City and found that there, too, charters are more effective than traditional public schools. By committing to the cap, the state is denying thousands of students the opportunity to attend high-quality schools.

To be sure, there are reasons beyond replicability to question the growth of a charter sector. Charter critics cite, for example, concerns about the effect of charter sector growth on traditional public school enrollment. But, particularly during National Charter Schools Week, states should be skeptical of arguments for charter school caps that rest on the claim that effective charter schools cannot be replicated.

Why Is There a Disconnect Between Research and Practice, and What Can Be Done About It?

What characteristics of teacher candidates predict whether they’ll do well in the classroom? Do elementary school students benefit from accelerated math coursework? What does educational research tell us about the effects of homework?

[Image: three interconnected cogs labeled "policy," "practice," and "research"]

These are questions that I've heard over the past few years from educators who are interested in using research to inform practice, such as attendees of researchED conferences. These questions suggest a demand among educators for evidence-based policy and practice. And yet, while the past twenty years have witnessed an explosion in federally funded education research and research products, data indicate that many educators are not aware of the federal research resources intended to support evidence use in education, such as the Regional Educational Laboratories or the What Works Clearinghouse.

Despite a considerable federal investment in both education research and structures to support educators' use of evidence, educators may be unaware of evidence that could improve policy and practice. What might be behind this disconnect, and what can be done about it? While the recently released Institute of Education Sciences (IES) priorities call for increasing research dissemination and use, they concentrate mainly on producing and disseminating: the supply side of research.

Expand Your Ed Policy Toolkit with Human-Centered Design

[Image: Design Methods for Education Policy website]

In February, I released a white paper making the case that policy professionals can create better education policies by using human-centered research methods because these methods are informed by the people whose lives will be most affected.

Yesterday, we released a companion website (https://designforedpolicy.org/) that gathers 54 human-centered research methods well-suited to education policy into one easy-to-navigate resource. We took methods from organizations like IDEO, Stanford's Hasso Plattner Institute of Design, and Nesta and organized them by the phases of a typical education policy project. We included brief explanations of how each method might be applied to your current work.

To be sure, you probably already use some human-centered design methods in your work, even if you don’t think of them that way. Interviews and observations are commonplace and provide highly valuable information. What the design world brings is a mindset that explicitly and deeply values the lived experiences of the people who are most impacted by problems and an array of methods to capture and analyze that information. It also adds a heavy dose of creativity to the process of identifying solutions. And despite a common misconception, when done well, human-centered design methods are very rigorous, fact-based, and structured to root out assumptions and biases.

When combined, common policy analysis methods and human-centered design methods can result in a powerful mix of quantitative and qualitative, deductive and inductive, macro and micro, rational and emotional elements.