While reporting a recent article on Sugar Creek Charter School's plans to add a high school, I was dismayed to see the test-score reporting by UNC Charlotte's Urban Education Collaborative. A 37-page report from the collaborative, which is part of the College of Education, bases its claims of "extraordinary outcomes in public education" on the school's proficiency gains between 2008 and 2012.
[Chart of Sugar Creek's proficiency gains, 2008-2012. Source: SchoolwiseCharlotte.org]
Yet nowhere do these researchers, who are part of a partnership with Sugar Creek known as Schoolwise, explain that scores plunged statewide in 2008, when North Carolina introduced a tougher reading exam, or that they rose sharply in 2009, when the state began giving students a second chance to pass. The curve depicted for Sugar Creek is common to most N.C. schools, with the biggest plunge-and-rise at schools serving kids who traditionally struggle to reach grade level.
I've called Charlotte-Mecklenburg Schools out on playing the same game in the past. I shudder to think how many national experts believe schools and programs across our state are successful based on big gains since 2008. Charts like this are a great marketing tool, if not exactly a testament to integrity in reporting.
When testing changes, year-to-year comparisons carry little meaning. At that point, the best bet is to see how a given school, district or group of schools compares with schools serving similar students. As I noted in my article, such comparisons indicate Sugar Creek is doing well compared with state and CMS averages, though the latest numbers are unlikely to inspire breathless praise. (For careful readers: The numbers I cite, from N.C. school report cards, represent the percentage of students who passed both reading and math tests; that's different from the composite score based on reading, math and science exams. Both are legitimate ways to measure proficiency.)
CMS used to do a good job of this when officials evaluated programs such as strategic staffing. The studies were sometimes buried online, but they existed. Unfortunately, a reader recently pointed out to me that the CMS research link, which I'd kept in the rail at the right of this blog, is now dead. If there's a new one, I can't find it.
[Photo: Chance Lewis]
Chance Lewis, director of the Urban Education Collaborative, says he's working on just such a comparison for Sugar Creek. He and I agree that the challenge is figuring out the fairest comparison group for the charter school, which serves grades K-8. Do you look at CMS neighborhood schools or at magnets? Focus only on other K-8 schools, or on elementaries and middle schools? Do categories such as "African American" and "economically disadvantaged" give a true apples-to-apples look?
Results for 2014 are due out later this summer. We already know they'll be up, because the state changed the scoring system to allow more students to pass. With all the uncertainty about Common Core, it's hard to tell what we'll get in coming years. Here's my forecast: By 2017, we're going to see lots of charts showing that schools have made amazing gains since 2013.