Learning from a Missed Opportunity in Massachusetts

If current predictions hold, several states will either set new limits on charter school growth and expansion or stand by their current ones. These limits, called charter school caps, place a ceiling on the number of charter schools or on the number of students those schools can enroll. Massachusetts did the same thing in 2016: Voters rejected Ballot Question 2, which would have raised the state’s cap on charter schools. But research released just last week suggests that Massachusetts voters made a mistake. The states currently considering similar legislation should pay attention.

In the study I’m referencing, authors Sarah Cohodes, Elizabeth Setren, and Christopher R. Walters examined the effect of a policy that allowed effective charter schools in Boston to replicate their school models at new locations. They found that these new schools produced large achievement gains that were on par with those of their parent campuses. What’s more, the average effectiveness of charter middle schools in the city increased after the policy reform.

This evidence could, perhaps, be dismissed if the sector saw only a marginal increase in the number of schools; that is, if there were only a few additional charter schools that pulled this off. But that’s not the case: Boston’s charter sector produced these results despite a doubling of the charter market share in the city.

This analysis would be a big deal for any charter sector, but it is particularly meaningful for Boston. As Bellwether found in a recent analysis, Boston has the highest-performing urban charter sector in the country. The average child who attended Boston charter schools benefited from roughly a full year of additional learning compared to students in traditional public schools: 170 additional days of learning in reading and 233 in math. And the research suggests that Boston charter schools have strong, positive effects on the learning outcomes of students with disabilities and English-language learners as well. The implication is that Boston’s charter schools didn’t just replicate their impact; they replicated some of the most effective charter schools we’ve ever seen, to the benefit of the thousands of students on Boston’s charter school waitlists.

The states poised to double down on charter caps, such as New York, Maine, and California, shouldn’t repeat the mistake Massachusetts made in 2016. New York, in particular, is at risk: In our analysis earlier this year, we examined the existing evidence on New York and New York City and found that there, too, charters are more effective than traditional public schools. By committing to the cap, the state is denying thousands of students the opportunity to attend high-quality schools.

To be sure, there are reasons to question the growth of a charter sector beyond whether charters can replicate their effectiveness across schools. Charter critics cite, for example, concerns about the effect of charter sector growth on traditional public school enrollment. But, particularly during National Charter Schools Week, states should be skeptical of arguments for charter school caps that claim charter schools cannot be replicated effectively.

Media: “Boston schools achievement gap remains wide along racial lines — a troubling sign” in Boston Herald

In February, Bellwether published “An Uneven Path: Student Achievement in Boston Public Schools 2007-2017.” Boston was in the midst of a leadership transition, and we advised the next superintendent to make tough choices in support of equity. Last week, the Boston School Committee chose Dr. Brenda Cassellius, former state superintendent in Minnesota, as the district’s next leader.

Chad Aldeman and I recently spoke to the Boston Herald about the findings in our report, and the challenges Dr. Cassellius will face in her new role:

“Black and Hispanic students have not been making enough progress,” said Chad Aldeman, senior associate partner at Bellwether Education Partners, a nonprofit that recently studied student achievement in Boston Public Schools. “It’s a troubling sign.”

BPS risks losing its status as a national leader in urban K-12 education if it doesn’t launch innovative strategies to address flattening test scores, the experts added. “If they want Boston to continue to be a stronger-than-average district, they have to focus on black, Hispanic, and low-income students,” said Bonnie O’Keefe, an associate partner with Bellwether Education Partners.

Bellwether board member Paul Reville also weighed in on Boston’s achievement gaps:

“It’s clearly a major challenge for Boston moving forward,” said Harvard Graduate School of Education professor Paul Reville, a former Massachusetts secretary of education. “They still have a long way to go.”

For more detail on Boston Public Schools’ progress and performance in the past ten years, take a look at “An Uneven Path.” Or read the full Boston Herald piece here.

Should Massachusetts PARCC the MCAS? Plus 5 Questions for Other States.

A recent Mathematica study found that new PARCC assessments were statistically no better at predicting which students were ready for college than Massachusetts’ old student assessments (called MCAS). Both tests were slightly superior to the SAT at identifying which students would be successful in college-level courses, but the study should prod all states’ thinking in a few ways:

1. Should we keep trying to build a better mousetrap? If $186 million and five years of work at PARCC* couldn’t produce something much better than what Massachusetts developed on its own in 1993, why continue this race?

2. If states pick a random test from the cheapest assessment vendor, will they get results even as good as what we’re seeing here? The study also found that cut scores matter: Although the two tests produced results that were statistically indistinguishable, the PARCC cut score did send a stronger signal than the cut score Massachusetts had been using on MCAS (the sketch after this list illustrates the trade-off a cut score involves). To examine how that’s playing out for their students, all states should be doing studies like this one, and the other federally funded assessment consortium, Smarter Balanced, should be as well. I suspect the results would be no different, or perhaps worse, than what Mathematica found in Massachusetts. If even the highest-quality standards and the best tests we have at the K-12 level don’t tell us that much about readiness for college, what chance do individual states have of coming up with something better?

3. Related to #2, should states still be coming up with their own high school achievement tests? Why don’t more states opt to use the SAT or ACT** as their high school accountability tests? This study found that PARCC and MCAS had slightly more predictive value than the SAT, but there are trade-offs. The SAT and ACT are older, shorter, and cheaper than what states typically offer; they’re familiar to parents; and, unlike a given state’s assessment, their scores are accepted by colleges and universities all across the country. The ACT and SAT have proven themselves to be useful, if slightly flawed, measures of college-readiness. Why do states think they can do better?

4. How much should we value alignment between K-12 academic standards and tests? One objection to just using the SAT or ACT for official state purposes is that they’re not aligned to each state’s academic content standards. But so what? Both PARCC and MCAS are closely aligned to Massachusetts’ state academic standards, but neither one is all that closely aligned to actual student outcomes at Massachusetts colleges and universities.

5. If there’s only a moderate relationship between high school test scores and first-year college GPA (let alone longer-term GPA or college completion rates), why do we continue to rely solely on these tests for accountability purposes? I happen to have a whole paper on this topic, but this study is yet another reminder that if states care about college-readiness, they need to be tracking actual college outcomes, not just test scores.
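To make the cut-score point in #2 concrete, here is a minimal simulation of the trade-off. This is a hypothetical sketch, not an analysis from the Mathematica study: the 0.4 test-outcome correlation, the two cut scores, and the “above-median outcome” definition of readiness are all assumptions chosen for illustration.

```python
# Hypothetical sketch: how much "readiness" signal does a moderately
# correlated test provide, and how does the cut score change that signal?
# All numbers below are illustrative assumptions, not study figures.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
r = 0.4  # assumed test-score/college-outcome correlation ("moderate")

# Draw correlated standard-normal (test score, college outcome) pairs.
test = rng.standard_normal(n)
outcome = r * test + np.sqrt(1 - r**2) * rng.standard_normal(n)

# Assumed definition of "college-ready": above-median college outcome.
ready = outcome > 0

for cut in (0.0, 0.8):  # two hypothetical cut scores on the test
    passed = test > cut
    precision = ready[passed].mean()  # share of passers who are ready
    recall = passed[ready].mean()     # share of ready students who pass
    print(f"cut={cut:+.1f}: pass rate={passed.mean():.0%}, "
          f"precision={precision:.0%}, recall={recall:.0%}")
```

Under these assumptions, raising the cut makes a passing score a stronger signal of readiness (higher precision), but it also screens out a larger share of students who would have succeeded (lower recall), which is why where a state sets its cut score can matter as much as which test it buys.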

*Disclosure: The PARCC consortium is a Bellwether client, but I am not involved in the work.

**ACT is a former Bellwether client. It was not part of the Mathematica study, but it has very similar correlations to first-year college outcomes.