Technical Problems Shouldn’t Spoil the Value of Good Tests

Following a technical problem that disrupted student testing in New Jersey, the Asbury Park Press editorialized that officials "have no real answers" for the problem and that the "fairness" of the tests is in doubt. "In true PARCC fashion everyone scrambled to blame someone else…It doesn't help the cause when officials continue to duck responsibility even as schools across the state are disrupted by PARCC technical errors."

In fact, New Jersey and PARCC officials deserve credit for immediately identifying and resolving the problem, which resulted from a minor Pearson employee error. Early Wednesday representatives from Pearson acknowledged the issue and pledged to do “everything possible to get the system back up.” And indeed, by Wednesday afternoon, officials had corrected the glitch—which stemmed from a log-on problem—ensuring testing would resume today. “It is important to us that these tests be administered in a way that eliminates avoidable disruptions and inconveniences,” said Laura Slover, CEO of PARCC.

State education commissioner David Hespe pointed out that it was the first major glitch New Jersey schools had encountered since they began administering PARCC assessments. The problem did not affect students in other states.

Testing problems are not unique to New Jersey. This year Alaska, which does not use PARCC assessments, experienced repeated technical problems. State officials later cancelled assessments for the year, meaning tens of thousands of families would not receive information about their children’s readiness.

The Asbury Park Press' insinuation that the technical problem should raise concerns about the fairness or quality of PARCC tests does not add up. Independent analyses have repeatedly affirmed the value of the assessments.

A two-year study by the Thomas B. Fordham Institute found PARCC assessments earned an "Excellent or Good Match" to the content of the Common Core State Standards, outperforming the ACT Aspire and MCAS tests.

Writing about the Fordham study, Mike Petrilli and Chester Finn explained that tests like PARCC "are harder to game" and "actually deserve to guide classroom instruction rather than be condemned for mindless 'test prep.'"

Other research confirms these findings. A study by the National Network of State Teachers of the Year (NNSTOY) found PARCC assessments better reflect the range of reading and math skills all students should master; closely align with classroom instruction; and are grade-level appropriate.

Pam Reilly, a participant in the NNSTOY research, wrote, “[Consortia tests] are a step in the right direction to begin improving student outcomes…I can say with confidence these new assessments are the kind we should want our kids to take.”