Tag Archives: Smarter Balanced

States Need to Get Real on Testing Tradeoffs Before Making Another Big Switch

Just a few years ago, it seemed like most of the country was heading towards common state assessments in math and reading. Two groups of states won federal grant funds to create higher-quality tests; these became the PARCC and Smarter Balanced test consortia. Now, despite the demonstrated rigor and academic quality of those tests, the testing landscape is almost as fractured as it was before, with states pursuing a variety of assessment strategies. Some states in the consortia are still waffling. Others that have left are already scrapping the tests they made on their own with no idea of what they’ll do next.

States should think carefully before going it alone or introducing a new testing overhaul without strong justification. There are some big tradeoffs at play in the testing world, and a state might spend millions on an “innovative” new test from an eager-to-please vendor only to find that it has the same, or worse, issues as the “next generation” tests they tossed aside.


Do New Common Core Test Results Tell Us Anything New?

What do new assessments aligned to the Common Core tell us? Not much more than what we already knew. There are large and persistent achievement gaps. Not enough students score at high levels. Students who performed well on tests in the past continue to perform well today. In short, while the new assessments may re-introduce these conversations in certain places, we’re not seeing dramatically different storylines.

To see how scores differ in the Common Core era, I collected school-level data from Maine. I chose Maine because they’re a small state with a manageable number of schools, they were one of the 18 states using the new Smarter Balanced test this year, and because they have already made data available at the school level from tests given in the spring of 2015.

The graph below compares average math and reading proficiency rates over two time periods. The horizontal axis plots average proficiency rates from 2012-14 on Maine’s old assessments, while the vertical axis corresponds to average proficiency rates in Spring 2015 on the new Smarter Balanced assessments.* There are 447 dots, each representing one Maine public school with sufficient data in all four years. The solid black line represents the linear relationship between the two time periods.

[Graph: Maine school proficiency rates, 2012-14 vs. 2015]


There are a couple of things to note about the graph. First is that, as has played out in many other places, proficiency rates fell. The average proficiency rate for these schools fell from 64 to 42 percent. While a number of schools saw average proficiency rates from 2012-14 in the 80s and even the 90s, no school scored above 82 percent this year (this shows up as white space at the top of the graph).

Second, there’s a strong linear relationship between the two sets of scores. The correlation between these time periods was .71, a fairly strong relationship. Schools that did well in the past also tended to do well, on a relative basis, in 2015.
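For readers who want to replicate this kind of comparison with their own state's data, the underlying calculation is simple: average each school's proficiency rates on the old tests, then correlate those averages with the new-test rates and fit a least-squares line. A minimal sketch with numpy, using made-up rates for four hypothetical schools (the actual analysis covered 447 Maine schools; none of these numbers are from the real data):

```python
import numpy as np

# Hypothetical school-level proficiency rates (percent) -- illustrative only.
# One row per school: old-test rates for 2012, 2013, and 2014.
old_rates = np.array([
    [70, 68, 72],   # School A
    [55, 60, 58],   # School B
    [85, 88, 90],   # School C
    [40, 42, 45],   # School D
])
# Each school's rate on the new Smarter Balanced test, spring 2015.
new_rates = np.array([48, 35, 62, 22])

# Horizontal axis of the graph: each school's 2012-14 average.
old_avg = old_rates.mean(axis=1)

# Correlation between the two time periods (the post reports .71 for Maine).
r = np.corrcoef(old_avg, new_rates)[0, 1]

# Least-squares fit (the solid black line in the graph).
slope, intercept = np.polyfit(old_avg, new_rates, 1)

print(f"correlation: {r:.2f}; fit: y = {slope:.2f}x {intercept:+.2f}")
```

A correlation near 1 with a slope below 1 and a negative intercept would match the pattern described here: relative standing is preserved even as every school's absolute proficiency rate drops.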

So what does all this mean? A few thoughts: