PARCC’s Plummet

Despite valiant efforts, I was unable to find a nice, neat, uplifting Friday education story for us to talk about today. That’s kind of a good thing, though. Pressing issues like the Jeffco recall-oisseurs’ inability to tell the truth have distracted us from a large education policy discussion backlog. Today we’re going to nibble on that backlog by taking a look at the latest developments for the tortured PARCC test.

Faithful readers will recall that my policy friends Ben DeGrow and Ross Izard published a joint op-ed late last legislative session calling on Colorado’s policymakers to reach a compromise on the testing issue—and to seriously reevaluate the state’s use of the much-maligned PARCC exam. The testing compromise happened (and little else), but Colorado remains in the PARCC testing consortium for now.

Meanwhile, PARCC seems to be entering a death spiral of sorts. Ohio recently cut its ties with the test, and the Arkansas State Board of Education voted to do the same yesterday.  Arkansas will now use the ACT Aspire. New York, which has kinda sorta flirted with PARCC without officially taking the plunge, also declined to extend its contract with PARCC creator Pearson for testing in grades 3-8. New York’s new Common Core-aligned tests in the affected grades will now be designed by a different company called Questar.

By my count, the PARCC consortium now covers only nine states, one of which is Colorado, and the District of Columbia. According to an Education Week blog piece by Peter Greene, fewer than five million students across the United States will be taking PARCC next year, and that number may get even lower before all is said and done. Considering that a major draw of the test was to allow apples-to-apples comparisons of yearly student test results across state boundaries, this is obviously a big problem.

There have been a lot of criticisms leveled at the PARCC test, some more valid than others. PARCC does not, for instance, ask students how many guns they have in their homes. However, last year’s administration of the test did cause a logistical nightmare due to being administered in two separate testing windows and requiring access to a computer for every tested student. It also was accused of eating into instructional time (this undoubtedly occurred, though not to the massive extent that some have claimed). Finally, the test is slow on its feet; teachers don’t receive actionable feedback quickly enough to put it to good use.

To its credit, the PARCC consortium did eventually respond to these complaints by altering its approach in terms of test length and administration windows. Colorado’s testing compromise bill also required that paper-and-pencil options be provided to alleviate some of the strain of cycling hundreds or thousands of students through limited computer equipment.

Even so, it looks like the improvements may have been too little, too late. The PARCC name has likely been irreparably damaged. And as the number of PARCC states dwindles, it's hard to imagine how the struggling test will reverse course. As Peter Greene puts it:

There’s no longer any question that PARCC’s golden days are long behind it. It will be interesting just how small the test can get before Pearson decides that the much-unloved not-so-mega-test is no longer worth their corporate time, trouble, and investment.

As I frequently remind you, measurement and associated accountability systems are critical components of successful education reform. But how you measure matters, and it seems like it may soon be time for Colorado to start looking more closely at that question. The testing compromise created a (tentative) way forward with a proposed testing pilot program that still awaits federal approval, but I strongly suspect that more immediate answers will be demanded in the coming year. We shall see.

In the meantime, have a great weekend! See you next week.