I recently referred to the EVAAS formulas used to calculate North Carolina's school growth and teacher effectiveness ratings as secret. Turns out I'm behind the times.
The Cary-based software company SAS, which created the formulas and markets them across the country, initially kept the specifics a proprietary secret. That's probably why Charlotte-Mecklenburg Schools officials have voiced wariness about having teachers' careers and school reputations depend on a formula they can't review.
It's because of such concerns that SAS released the formulas, which have been tested by groups such as RAND Corp. and UNC-Chapel Hill, says Jennifer Preston of the N.C. Department of Public Instruction.
But that doesn't mean most educators, citizens and journalists can run the numbers themselves. I'm comfortable with Excel spreadsheets, education data and basic calculations. But when I see lines like "K'b + M'u is BLUP of K'β + M'u provided K'β is estimable," I'm out.
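For readers who want a rough feel for that jargon: BLUP stands for "best linear unbiased prediction," the standard way of estimating effects in a mixed statistical model. What follows is not SAS's proprietary EVAAS model; it's a minimal toy value-added model, with made-up student scores, using Python's open-source statsmodels library, just to show how a "teacher effect" falls out of mixed-model machinery.

```python
# A toy value-added model -- NOT the EVAAS formula, only an
# illustration of the mixed-model machinery behind terms like BLUP.
# Data here is fabricated; requires numpy, pandas and statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Fabricate 20 teachers with 25 students each.
n_teachers, n_students = 20, 25
true_effect = rng.normal(0, 3, n_teachers)  # hidden "teacher effect"
rows = []
for t in range(n_teachers):
    prior = rng.normal(50, 10, n_students)  # last year's scores
    score = prior + true_effect[t] + rng.normal(0, 5, n_students)
    rows.append(pd.DataFrame({"teacher": t, "prior": prior, "score": score}))
df = pd.concat(rows, ignore_index=True)

# Mixed model: score depends on prior achievement (a fixed effect)
# plus a random intercept per teacher. The estimated random
# intercepts are the BLUPs -- the "teacher effects."
model = smf.mixedlm("score ~ prior", df, groups=df["teacher"])
result = model.fit()
blups = {t: re.iloc[0] for t, re in result.random_effects.items()}
print(sorted(blups.items(), key=lambda kv: kv[1]))
```

The real EVAAS models, as I understand them, layer in multiple years and subjects of scores for every student, which is where the matrix notation in that quote comes from.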
Schools are labeled as meeting, exceeding or falling short of growth targets based on how their students did compared with projections. For many, 2013 growth ratings provided a counterpoint to the bleak picture painted by low proficiency rates on new exams. In 2014, proficiency and growth will combine to create a state-issued letter grade for all public schools. For charter schools, growth ratings are a key factor in determining whether a low-scoring school stays open.
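The labels themselves boil down to a comparison of estimated gains against projections. As I understand the state's reporting convention, a school exceeds growth when its gain is roughly two standard errors above projection and falls short when it's two below; the sketch below shows that labeling step in code (the cutpoints and function name are mine, for illustration, not SAS's published rule).

```python
# Hypothetical illustration of turning a growth estimate into the
# three state labels. The +/-2 standard-error cutpoints are an
# assumption for illustration, not SAS's published rule.
def growth_label(gain_estimate: float, standard_error: float) -> str:
    index = gain_estimate / standard_error
    if index >= 2:
        return "exceeded expected growth"
    if index <= -2:
        return "did not meet expected growth"
    return "met expected growth"

print(growth_label(3.1, 1.2))   # exceeded
print(growth_label(-0.4, 1.0))  # met
print(growth_label(-2.5, 0.9))  # did not meet
```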
There are, of course, people who say no formula can turn student test scores into meaningful measures of school quality and teacher effectiveness. But given that our state legislators and many national policymakers believe otherwise, it's important to be able to check the validity of those ratings.
Anyone who works with data, even on a much simpler scale, knows how easy it is to make a mistake -- and for that mistake to be compounded as you run it through further calculations. I've caught plenty of errors (my own and those of institutions I cover) by seeing that numbers don't jibe with what I know of reality.
It worries me that such crucial numbers aren't subject to an obvious "smell test." But Preston said the state is building in backstops. For starters, teachers get a chance to review the roster of students being used in their ratings, to make sure they're getting credit or blame for the right kids. Schools and districts review the raw data before it's sent to SAS. And the state has been reviewing dozens of questions that came in after the release of ratings, Preston said.
Preston, a former high school teacher, says the real value of EVAAS numbers comes from teachers who use student data to craft teaching strategies and principals who use them to deploy their faculty wisely. She said her numbers showed she was helping low-scoring students make big gains, while the students who came in strong stayed flat. Her principal assigned her to a low-performing class the next year, while a teacher who got better gains from higher-level students took that group. "We were both teaching to our strong points," she said.