For those who have been following my JLF environmental newsletter, the recent headline in the Charlotte Observer stating “Charlotte heading to a clean-air milestone” will come as no news at all. As we have been pointing out week after week since July in our “2013 Ozone Report,” the state of North Carolina has had only one code orange day this entire summer, and it was registered on one monitor in the Triad. So it is not just Charlotte that is having a record “clean air” season; it is the entire state.

Referencing comments from North Carolina Division of Air Quality (DAQ) spokesman Tom Mather, the article points out that the primary reason for such a good year is favorable weather. As usual, though, DAQ feels compelled to cite North Carolina’s Clean Smokestacks Bill (CSB), passed in 2002, as a contributing factor. This response from DAQ, while predictable, is not supported by comparisons with data from surrounding states that never adopted such a law. If the CSB were even partially responsible for low ozone in North Carolina, then we should have better results than our neighbors. In fact, we do not.

Unfortunately, the author of the article, Bruce Henderson, also didn’t stick strictly to the facts, feeling compelled to interject his own assertions about the health consequences of this year’s numbers. Without any supporting data, or even the kind of anecdotes that typically accompany such news reports, he asserts definitively that “A summer of cleaner air meant easier breathing for the thousands of people with asthma and other respiratory conditions.” While this statement may or may not be true, it is an empirical assertion that is at odds with comparative asthma and ozone trends: for decades, the incidence of asthma in the state has been trending upward even as ozone levels have been declining.

Henderson also cites the American Lung Association’s “2013” ozone rankings of cities across the country, released in April, implying that they relate to last year’s ozone season. In fact, they do not. The ALA’s rankings are never based on the latest data; the April rankings covered the three-year period from 2009 to 2011, so at best they reflect data that is two years old and at worst data that is four years old. The 2012 data will not show up in the ALA rankings until April of 2014, and this year’s performance will not appear in the ALA report until April of 2015.