What Good Are Higher Graduation Rates If Students Aren’t Learning More?

On Thursday, the National Assessment of Educational Progress (NAEP) released the results of its 2015 science assessment for America’s 4th, 8th, and 12th grade students. Only 22 percent of 12th graders scored at or above the proficient level, compared to 38 percent of 4th graders and 34 percent of 8th graders. And while 4th and 8th graders both saw a small but significant improvement from 2009, high school seniors stagnated — earning the same average score as the 2009 sample.

This was also true across all subgroups. Among students of color, students with disabilities, English language learners (ELLs), rural students, and female students, not a single group saw a statistically significant score change from 2009.

[Chart: 2015 NAEP Science Assessment Scores]

We saw a similar trend in April, when NAEP released the 12th grade results of its 2015 reading and math assessments. Seniors’ average reading score did not significantly change — again across every single subgroup. The average 12th grade math score declined.

And yet, earlier this month, data released by the U.S. Department of Education (ED) showed that America’s high school graduation rate has reached a record high of 83 percent, continuing a five-year trend. In stark contrast with this year’s NAEP data, rates among students of color, students with disabilities, ELLs, and low-income students have all improved.

While this is certainly good news, it raises the question: What good are higher graduation rates if students aren’t learning more?

According to ED Secretary John King: “Students who have a high school diploma do better in the 21st Century economy than students who don’t. So having a higher graduation rate is meaningful progress.” While high school graduates do earn more than non-graduates, this answer is still deeply unsatisfying.

States will have the opportunity to seriously address America’s stagnant high schools in the coming years. The Every Student Succeeds Act (ESSA), signed into law last December, provides greater flexibility for states in almost every facet of federal K-12 education policy. The law makes it easier for states to spend Title I money on high school students. It also gives states much greater leeway for using school improvement funds, including an optional set-aside for programs like Advanced Placement, International Baccalaureate, and career and technical education. It remains to be seen exactly how states will implement the law, but luckily we’ll have NAEP along the way to give us a national snapshot of student learning.

We’re doing a better job of shepherding students to high school completion — now we just need to make sure they actually learn something.

Should Massachusetts PARCC the MCAS? Plus 5 Questions for Other States.

A recent Mathematica study found that new PARCC assessments were statistically no better at predicting which students were ready for college than Massachusetts’ old student assessments (called MCAS). Both tests were slightly superior to the SAT at identifying which students would be successful in college-level courses, but the study should prod all states’ thinking in a few ways:

1. Should we keep trying to build a better mousetrap? If $186 million and five years’ work at PARCC* can’t design something much better than what Massachusetts developed on its own in 1993, why continue this race?

2. If states pick a random test from the cheapest assessment vendor, will they get results even as good as what we’re seeing here? The study also found that cut scores matter. Although the two tests produced results that were statistically indistinguishable, the PARCC cut score did send a stronger signal than the cut score Massachusetts had been using on MCAS. To examine how that’s playing out for their students, all states should be doing studies like this one, and the other federally funded assessment consortium, Smarter Balanced, should be as well. I suspect the results would be no different, or perhaps worse, than what Mathematica found in Massachusetts. If even the highest-quality standards and the best tests we have at the K-12 level don’t tell us that much about readiness for college, what chance do individual states have of coming up with something better?

3. Related to #2, should states still be coming up with their own high school achievement tests? Why don’t more states opt to use the SAT or ACT** as their high school accountability tests? This study found that PARCC and MCAS had slightly more predictive value than the SAT, but there are trade-offs. The SAT and the ACT are older, shorter, and cheaper than what states typically offer, plus they’re familiar to parents and, unlike a given state’s assessment, SAT and ACT scores are accepted by colleges and universities all across the country. The ACT and SAT have proven themselves to be useful, if slightly flawed, measures of college-readiness. Why do states think they can do better?

4. How much should we value alignment between K-12 academic standards and tests? One objection to just using the SAT or ACT for official state purposes is that they’re not aligned to each state’s academic content standards. But so what? Both PARCC and MCAS are closely aligned to Massachusetts’ state academic standards, but neither one is all that closely aligned to actual student outcomes at Massachusetts colleges and universities.

5. If there’s only a moderate relationship between high school test scores and first-year college GPA (let alone longer-term GPA or college completion rates), why do we continue to rely solely on these tests for accountability purposes? I happen to have a whole paper on this topic, but this study is yet another reminder that if states care about college-readiness, they need to be tracking actual college outcomes, not just test scores.

*Disclosure: The PARCC consortium is a Bellwether client, but I am not involved in the work.

**ACT is a former Bellwether client. It was not part of the Mathematica study, but it has very similar correlations to first-year college outcomes.