July 5, 2010

RALEIGH — A newly published survey of North Carolina university faculty members raises concerns about misleading and poorly constructed questions on two end-of-course tests used in North Carolina’s public schools.

The John Locke Foundation’s latest Spotlight report highlights the survey’s results and recommends testing changes. “The N.C. Department of Public Instruction should not be in the testing business,” said report author Terry Stoops, JLF Director of Education Studies. “Now that the department has released some tests — for the first time — we can begin to assess potential problems linked to poorly designed test questions.”

Survey results reveal real concerns, Stoops said. “Most university economists responding to the survey objected to every test question submitted for their review,” he said. “In addition, a majority of political scientists raised objections to answers provided for two of the six test questions they reviewed.”

The best solution to bad test questions is for state education officials to get rid of their end-of-grade and end-of-course tests, Stoops said. “Instead of North Carolina’s questionable tests, students should take an independent, field-tested and credible national test of student performance.”

Even if state education officials refuse to scrap North Carolina’s tests, they could improve the current system, Stoops said. “If North Carolina is going to keep its tests, DPI should create a test question review board consisting of college and university faculty and subject-area experts from the private and public sector.”

First approved in 1996, tests associated with North Carolina public schools’ accountability program, the ABCs of Public Education, were never available for public inspection until 2009.

“For students, well-reasoned but incorrect answers to a handful of test questions could mean the difference between meeting and not meeting state proficiency standards,” Stoops said. “Public school teachers also complained that poorly constructed test questions undermined months of classroom instruction and weeks of test preparation. Until the release of the 2008-09 end-of-grade and end-of-course tests, it was impossible to substantiate or refute either concern.”

Now parents, teachers, subject-area experts, and outside observers can subject test questions to their own tests, Stoops said. “That’s the motivating force behind our new survey.”

End-of-course tests in civics and economics and U.S. history appeared to provide the most opportunities for subjective responses and multiple valid interpretations, Stoops said. “Once the end-of-course tests in those subject areas became available, scholars in economics and political science compiled a list of questions with no definite correct answer or answers subject to multiple interpretations.”

No one claims that the selected questions are representative of all end-of-course test questions, Stoops said. “While suggestive, a handful of flawed test questions are not sufficient to condemn the entire ABCs of Public Education,” he said. “Yet one must ask how these questions survived DPI’s 22-step, four-year test development process.”

Once scholars had chosen six questions related to economics and another six linked to political science, Stoops drafted two different surveys and submitted them to more than 500 economics and political science professors at public and private colleges and universities in North Carolina. About 13 percent of the economics professors and 8.5 percent of political science professors responded.

Stoops identified some factors that lowered the participation rate. “Disagreement with the John Locke Foundation’s mission appeared to play a significant role, as some professors returned unopened survey materials and others actively discouraged colleagues from participating,” he said. “One even returned a survey with the comment, ‘Stop your ideological attack on public education!’”

“It’s interesting that scholars whose jobs depend on the process of open inquiry would turn down an opportunity to add to the pool of knowledge about a testing program that has been shrouded in secrecy,” Stoops added.

Those who did respond to the survey generated interesting results, Stoops said. “In general, economists surveyed were not pleased with any of the six questions they saw,” he said. “For example, one question asked students to choose which economic ‘concept’ a particular example demonstrated. One economist wrote that the state’s preferred answer is not an economic ‘concept’ at all. A handful of respondents agreed that the best answer wasn’t even included among the options.”

Most political scientists responding to the survey did not object to four of the six questions they reviewed. A vast majority did object to answers provided for the other two questions. “In fact, no respondents believed DPI provided a correct answer for a question about civic responsibility,” Stoops said. “One respondent said the question confused legal requirements with civic responsibilities.”

More scrutiny would improve the current testing system, Stoops said. “The release of the 2008-09 state tests is a good start,” he said. “DPI should continue to conduct a more transparent and accountable testing program, including an online data tool that allows users to analyze test questions based on student responses.”

Terry Stoops’ Spotlight report, “Survey of End-of-Course Test Questions: Many college and university faculty are concerned about the quality of state standardized tests,” is available at the JLF Web site. For more information, please contact Stoops at (919) 828-3876 or [email protected]. To arrange an interview, contact Mitch Kokai at (919) 306-8736 or [email protected].