Welcome

Last week, the NC Department of Public Instruction released state test scores, and they dropped across the board.  In this week’s CommenTerry, I examine two possible explanations for the drop — 1) changes in cut scores and 2) adoption of more rigorous standards and tests.

Bulletin Board

  • Attend. A list of upcoming events sponsored by the John Locke Foundation can be found at the bottom of this newsletter, as well as here.  We look forward to seeing you!
  • Donate. If you find this newsletter mildly informative or entertaining, please consider making a tax-deductible contribution to the John Locke Foundation.  The John Locke Foundation is a 501(c)(3) not-for-profit organization that neither accepts nor seeks government funding. It relies solely on the generous support of individuals, corporations, small businesses, and foundations.
  • Share. The North Carolina History Project seeks contributors to the North Carolina History Project Encyclopedia. Please contact Dr. Troy Kickler for additional information.
  • Discuss. I would like to invite all readers to submit brief announcements, personal insights, anecdotes, concerns, and observations about the state of education in North Carolina.  I will publish selected submissions in future editions of the newsletter. Requests for anonymity will be honored. For additional information or to send a submission, email Terry at [email protected].
  • Revisit. We have archived all research newsletters on our website.  Access the archive here.

CommenTerry

The debate over state test scores has become pretty confusing for the average North Carolinian.  It is easy to understand why.  The issue involves new standards and curriculum, unfamiliar terminology, and a lack of straightforward explanations from those in charge, the NC Department of Public Instruction (NC DPI). Indeed, the publication designed to explain the changes to the public, "2013 READY Accountability Background Brief (PDF)," is six dense pages long.  I will do my best to sort it out.

Why did the scores drop?

The primary reason why the scores dropped was that the state changed the "cut scores."  ETS researchers Michael Zieky & Marianne Perie explain,

Cut scores are selected points on the score scale of a test. The points are used to determine whether a particular test score is sufficient for some purpose. For example, student performance on a test may be classified into one of several categories such as basic, proficient, or advanced on the basis of cut scores.

The NC State Board of Education approved new cut scores in October, and the changes were reflected in the test scores released this month.

Raising cut scores means that students must answer more questions correctly to reach a certain achievement level.  In North Carolina, those achievement levels range from Level 1 to Level 4.  The state classifies students who answer enough questions to reach Level 3 as "proficient."

For example, the 2013 general math exam for third grade included 44 test items.  In order to reach Level 3, i.e., "proficient," a student must have answered at least 30 questions (68 percent) correctly.  By comparison, in 2011 the third-grade math test had 50 questions but required students to answer a minimum of only 24 questions (48 percent) correctly to reach Level 3.  Obviously, it was much tougher to be proficient on the 2013 test than on the 2011 test.  Why state education officials kept cut scores so low in the past is a relevant question, but one that I will not discuss here.
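For readers who want to check the arithmetic, the cut-score comparison above can be reproduced in a few lines.  This is a minimal illustration using only the figures quoted in this newsletter; the test names and numbers are taken from the paragraph above, not from any official data file.

```python
# Cut-score figures quoted in this newsletter (minimum correct answers
# needed to reach Level 3, "proficient," on the third-grade math test).
tests = {
    "2013 third-grade math": {"items": 44, "level3_minimum": 30},
    "2011 third-grade math": {"items": 50, "level3_minimum": 24},
}

for name, t in tests.items():
    # Percentage of items a student must answer correctly to be "proficient"
    pct = 100 * t["level3_minimum"] / t["items"]
    print(f'{name}: {t["level3_minimum"]} of {t["items"]} items = {pct:.0f} percent')
```

The output shows the gap directly: roughly 68 percent of items correct in 2013 versus 48 percent in 2011.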

We also know that the resulting proficiency rates are similar to those found on more rigorous and reputable tests of student achievement, such as the National Assessment of Educational Progress (NAEP).  According to state test results, 47.6 percent of North Carolina fourth-graders and 34.2 percent of eighth-graders were proficient in math.  In reading, 43.7 percent of fourth-graders and 41.0 percent of eighth-graders were proficient.

The NAEP math results from the same school year yielded remarkably similar percentages to those reported by the state.  Specifically, 45 percent of fourth-graders and 36 percent of eighth-graders were proficient in math, a difference of approximately two percentage points compared to state tests.  In reading, the state scores were approximately eight percentage points above the NAEP scores.  Thirty-five percent of fourth-grade and 33 percent of eighth-grade students in North Carolina were proficient in reading.

Are standards and tests more difficult?

We know that the cut scores were raised.  We also know that the proficiency levels are similar to those reported by the trustworthy NAEP tests. The role of standards and test questions is not as straightforward. 

State education officials insist that test scores dropped, in part, because Common Core math and reading standards were more demanding than previous state standards.  They contend that test questions reflected the increased rigor of the Common Core standards.  The implication is that test takers would have a more difficult time determining the correct answer because the questions are harder.

Unlike cut scores, the quality and rigor of standards and test questions are difficult to quantify.  Certainly, test questions are field tested and "validated" by NC DPI staff and their contractors.  But, as we have seen throughout the years, this process is not an exact science.  Indeed, North Carolinians have had many reasons to question the quality of the state testing program, which has been subject to every tweak and twerk imaginable during its 18-year history.

Even if we granted that the standards and test questions were more challenging than past ones, there are no assurances that they will stay that way.  Given their penchant for tinkering, NC DPI may purposefully or unintentionally alter the relative difficulty of the tests, however one would measure such things.

Where does that leave us?

I believe that it would be a mistake to support the state testing program simply because 1) the state finally raised the cut scores, and 2) the results are similar to those of more reputable testing programs, such as the NAEP.  On the other hand, it is also a mistake to dismiss the results as irrelevant because 1) we lack evidence that the standards and tests are more rigorous, and 2) the state testing program has a poor track record.

In sum, state education officials have given North Carolinians legitimate reasons to love and hate these tests.

Facts and Stats

As usual, the mainstream media have ignored the performance of charter schools on state tests.  I suspect they have done so because charters generally outperform their district counterparts on multiple measures of student achievement.

According to the composite reading and math scores for students in grades 3-8, as published by the NC Department of Public Instruction,

  • 85.1 percent of charters met or exceeded growth versus 71.4 percent of districts (HT: North Carolina Public Charter Schools Association);
  • 39.7 percent of charter school students in grades 3-8 were proficient in math and reading, while only 32.0 percent of district school students were proficient (Note: The charter school percentage includes alternative and first-year schools.);
  • Three charter schools posted remarkably high test scores for their elementary and middle school students.  Metrolina Regional Scholars Academy in Charlotte (90.7 percent), Quest Academy Charter School in Raleigh (88.0 percent), and Magellan Charter School in Raleigh (85.1 percent) were the top three;
  • Triangle Math and Science Academy had an impressive start.  The 2012-13 school year was its first, yet 63.8 percent of its students scored at or above proficient on math and reading tests; and
  • Charter schools, on average, had a proficiency percentage that was 35.3 percentage points lower than the previous year.  This was consistent with the statewide drop of 35.5 percentage points.

Please note: My purpose here was simply to provide a bird’s-eye comparison of charter and district performance.  As such, I did not account for differences in the student populations, school size, or any of the other relevant variables.

Education Acronym of the Week

NAEP — National Assessment of Educational Progress

Quote of the Week

"Students today are expected to solve problems and to use knowledge in new ways. We have raised standards for students because we want them to be ready for anything they choose to do after high school. That means doing more to prepare them for the competitive challenges of college and careers."

State Superintendent June Atkinson, quoted in an NC DPI press release

Click here for the Education Update archive.