
Friday, February 21, 2014

Teacher ratings: Best and worst in Charlotte region

The teacher effectiveness ratings released this week provide rich material for analysis and debate. I just got the spreadsheet from the N.C. Department of Public Instruction, and I'll be poring over it to see what trends emerge.

But I couldn't resist a quick search to see which schools fall at the top and the bottom for our area. First I looked for those with the highest percentage of teachers who exceeded the state's goals for student growth on state exams. I eliminated those with fewer than 20 teachers, where percentages are so dramatically swayed by one or two individuals. That shuts out most elementary schools, because state testing starts in third grade and only fourth- and fifth-grade teachers have students with a previous year's scores to base growth projections on.
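The filtering and ranking described above can be sketched in a few lines. This is just an illustration with invented school names and numbers, not the actual DPI spreadsheet or its column names:

```python
# Made-up sample rows; the real data came from the DPI spreadsheet.
schools = [
    {"name": "School A", "teachers": 45, "pct_exceeded": 77.8},
    {"name": "School B", "teachers": 12, "pct_exceeded": 90.0},  # under 20 teachers
    {"name": "School C", "teachers": 23, "pct_exceeded": 73.9},
]

# Drop schools with fewer than 20 teachers, where one or two
# individuals can dramatically sway the percentage.
eligible = [s for s in schools if s["teachers"] >= 20]

# Rank the rest by the share of teachers exceeding growth goals.
top = sorted(eligible, key=lambda s: s["pct_exceeded"], reverse=True)
print([s["name"] for s in top])  # School B is excluded despite its 90 percent
```

The same sort, run in descending order on the share of teachers who fell short, produces the "worst" list further down.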

Here's what came up:

1. Weddington Middle (Union County), 77.8 percent of 45 teachers.
2. Highland School of Technology (Gaston County magnet), 73.9 percent of 23 teachers.
3. South Point High (Gaston County), 68.9 percent of 45 teachers.

4. Ridge Road Middle (CMS), 60.9 percent of 46 teachers.
5. Marvin Ridge High (Union County), 60.5 percent of 38 teachers.
6. Mount Holly Middle (Gaston County), 59.3 percent of 27 teachers.
7. Lake Norman High (Iredell-Statesville), 53.6 percent of 56 teachers.
8-9. Winkler Middle (Cabarrus County), 52.5 percent of 40 teachers.
8-9. Porter Ridge High (Union County), 52.5 percent of 40 teachers.
10. South Charlotte Middle (CMS), 51.4 percent of 35 teachers.

Statewide, 23 percent of teachers exceeded the target.

I also sorted for schools with the highest percentage of teachers who failed to meet the growth target. Again eliminating schools with fewer than 20 teachers, they are:

1. Hopewell High (CMS), 56.8 percent of 44 teachers.
2. Vance High (CMS), 54 percent of 50 teachers.
3. Harding High (CMS), 53.2 percent of 47 teachers.
4. Statesville High (Iredell-Statesville), 51.5 percent of 33 teachers.
5. North Meck High (CMS), 50 percent of 44 teachers.
6. West Meck High (CMS), 48.4 percent of 62 teachers.
7. Friday Middle (Gaston County), 47.8 percent of 23 teachers.
8. Grier Middle (Gaston County), 46.2 percent of 26 teachers.
9. Hunter Huss High (Gaston County), 45.9 percent of 37 teachers.
10. Independence High (CMS), 44.3 percent of 61 teachers.

Twenty-one percent of all N.C. teachers fell short of the growth target.

My search included district and charter schools in Mecklenburg, Union, Cabarrus, Iredell, Catawba, Lincoln and Gaston counties.

I'm intrigued by these numbers, but I want to be clear that this is not a definitive picture of academic quality at these schools. It's worth noting that all schools on the "worst" list had teachers with top ratings, and most on the "best" list had teachers who fell short. There's still plenty of room for debate on whether the value-added formula can really turn student test scores into a meaningful measure of how good a teacher is. But these ratings are shaping education decisions and teachers' careers, so they're worth exploring.

Monday, December 16, 2013

Academic growth formula: Not secret, just complex

I recently referred to the EVAAS formulas used to calculate North Carolina's school growth and teacher effectiveness ratings as secret. Turns out I'm behind the times.

The Cary-based software company SAS, which created the formulas and markets them across the country, initially kept the specifics a proprietary secret. That's probably why Charlotte-Mecklenburg Schools officials have voiced wariness about having teachers' careers and school reputations depend on a formula they can't review.

It's because of such concerns that SAS released the formulas, which have been tested by groups such as RAND Corp. and UNC Chapel Hill, says Jennifer Preston of the N.C. Department of Public Instruction.

But that doesn't mean most educators, citizens and journalists can run the numbers themselves. I'm comfortable with Excel spreadsheets, education data and basic calculations. But when I see lines like "KTb + MTu is BLUP of KT + MT provided KT is estimable," I'm out.

The calculations turn each student's performance on prior exams into a prediction about how they'll do on the next ones. The actual score is compared with the projection. Teachers' "value-added" ratings compare their students' progress with that of students taught by other teachers across the state. Those ratings form part of the state's teacher evaluation; persistent low ratings jeopardize a teacher's job, while strong ratings may someday lead to performance pay.
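The basic idea of comparing actual scores with projections can be shown in a toy form. This is not the EVAAS model itself, just a crude illustration with invented scores: fit a simple line from last year's scores to this year's, then look at how far each student landed above or below the prediction:

```python
# Invented scores for four students: last year's exam vs. this year's.
prior = [50.0, 60.0, 70.0, 80.0]
actual = [55.0, 63.0, 74.0, 90.0]

# Ordinary least-squares fit of this year's score on last year's.
n = len(prior)
mean_x = sum(prior) / n
mean_y = sum(actual) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(prior, actual)) / \
        sum((x - mean_x) ** 2 for x in prior)
intercept = mean_y - slope * mean_x

# Each student's "growth" is the gap between the actual score and the
# projection; averaging those gaps over a teacher's roster gives a
# crude value-added estimate.
residuals = [y - (intercept + slope * x) for x, y in zip(prior, actual)]
teacher_value_added = sum(residuals) / n
```

The real formulas work with multiple years of scores, multiple subjects and statistical safeguards for missing data, which is where the complexity comes from.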

Schools are labeled as meeting, exceeding or falling short of growth targets based on how their students did compared with projections. For many, 2013 growth ratings provided a counterpoint to the bleak picture painted by low proficiency rates on new exams. In 2014, proficiency and growth will combine to create a state-issued letter grade for all public schools. For charter schools, growth ratings are a key factor in determining whether a low-scoring school stays open.

There are, of course, people who say no formula can turn student test scores into meaningful measures of school quality and teacher effectiveness. But given that our state legislators and many national policymakers believe otherwise, it's important to be able to check the validity of those ratings.

Anyone who works with data, even on a much simpler scale, knows how easy it is to make a mistake -- and for that mistake to be compounded as you run it through further calculations. I've caught plenty of errors (my own and those of institutions I cover) by seeing that numbers don't jibe with what I know of reality.

It worries me that such crucial numbers aren't subject to an obvious "smell test." But Preston said the state is building in backstops. For starters, teachers get a chance to review the roster of students being used in their ratings, to make sure they're getting credit or blame for the right kids. Schools and districts review the raw data before it's sent to SAS. And the state has been reviewing dozens of questions that came in after the release of ratings, Preston said.

Preston, a former high school teacher, says the real value of EVAAS numbers comes from teachers who use student data to craft teaching strategies and principals who use them to make good use of their faculty. She said her numbers showed she was helping low-scoring students make big gains, while the students who came in strong stayed flat. Her principal assigned her to a low-performing class the next year, while a teacher who got better gains from higher-level students took that group. "We were both teaching to our strong points," she said.