Friday, July 16, 2010

Test scores coming: Time to cheer?

The summer storm of school data is starting. Wake County Public Schools released their results from 2010 state exams this week, and Charlotte-Mecklenburg will follow on Monday. Statewide results will come over the next couple of weeks.

It's safe to predict some celebrating in Charlotte, based on Wake's results and the big fat hints that local parents, faculty and officials have been dropping since the district scored the tests in June.

It's also safe to predict controversy over what the scores mean. I'm hearing buzz about some great results in high schools, driven partly by a new state requirement that students who fall just short of passing the first time try again (they take a different version of the same test). In Wake, the biggest gains came from retesting. Some, including CMS Superintendent Peter Gorman, argue that's a bogus gain.

There's also valid debate about how well test scores (or any other data) gauge real education, let alone identify who's responsible for success or failure. Barring miracles -- and certain cranky reporters are always skeptical of "miraculous" gains -- there will still be profound and painful gaps between schools and groups of students. Inevitably, teachers who have poured their hearts out working with the most disadvantaged kids will feel battered by the public release of "failing" scores.

In our fast-paced, competitive society, it's easy to view the results like sports scores: Identify winners and losers, cheer the folks you like and boo the ones you don't. That may be fun, but it's not terribly helpful.

I'm a certifiable data geek, but I've always told parents that numbers never give you the answer. They just help you ask better questions.

So if you're hoping to make sense of the data storm that's coming, start by reading up on what's to come. Scores on the end-of-year exams will be sorted into passing (grade level or above) or failing (below or well below), with the overall pass rate used as the broadest marker of school achievement. The state also calculates student growth during the school year; a school that started with well-prepared students can end up with a high pass rate but subpar growth. The reverse can be true as well.

The state then applies ABC ratings, which range from "Honor school of excellence" to "low performing."

The federal No Child Left Behind Act parses results into "AYP ratings," based on whether a school makes adequate yearly progress toward complex and changing targets. I've spent years trying to understand and explain those ratings, and I've concluded they carry very little value for families. You essentially end up with a pass/fail label that requires 10 pages of footnotes to clarify, with sanctions for failure that apply only to the highest-poverty schools.

If all that's not enough, there will also be a report on how many of the students who started ninth grade in 2006-07 got diplomas this year. CMS's below-average graduation rate of 66 percent was the bug in the punch bowl of a mostly positive report for 2009. I haven't gotten any strong signals about what to expect this year.

14 comments:

Anonymous said...

"In our fast-paced, competitive society, it's easy to view the results like sports scores: Identify winners and losers, cheer the folks you like and boo the ones you don't. That may be fun, but it's not terribly helpful."


What planet do you live on? When I lose, it means cash out of my pocket. That is the real world, but hey, let's let these kids think it is okay to lose. Then it will be no big deal for not applying themselves. So when they lose their house, job, car, family, freedom, etc., it's no big deal, right?

Anonymous said...

One feature that stability in student assignment would give us is the ability to look at how a group of kids with a certain pass rate in 8th grade reading has improved from its 5th grade pass rate, and from its 3rd grade pass rate before that. I have always been puzzled why we compare this year's 8th grade test results with last year's 8th grade test results when it is a different bunch of kids.

Anonymous said...

My expectation, based on my high school's scores and some middle school scores I've seen, is that we're going to see some remarkable jumps in pass rates, especially in math.

The real challenge, especially as the PTB roll out their Pay for Performance model, is how schools will be expected to raise these scores with class sizes growing by 10-20%. Once again, regardless of what the PTB say, we will be expected to do more with much less.

Anonymous said...

I want to know how the charter schools compare.

Ann Doss Helms said...

Charters will be part of the state ABC release Aug. 5. You can find their performance for previous years at www.ncreportcards.org/src/

rjrumfh said...

If you want to see a true measurement of how well students improve on test scores, give a pre-test and a post-test.

Anonymous said...

I liked how WC presented their scores. Pretty clear to see the grade and race results.

Anonymous said...

I am curious to find out how NC allows CMS to prepare students for the EOGs. I read in Dr. Canada's book that by the time the 3rd graders in his Harlem charter school take the 3rd grade test, they have already taken it three times for practice.

Anonymous said...

90% IN 2014 OR BUST.

Whatever the reports on grades and performance show this summer, I still relish the big enchilada: a 90% graduation rate in 2014.

Dr. Gorman is correct to be skeptical about how grades are reported. There is little glory in being able to say you passed an EOG in middle school while you apply for welfare as an untrainable, unemployed adult.

Bolyn McClung
Pineville

Anonymous said...

Ann, thank you for your work presenting data. I too am a data nerd and share your anticipation for this year's numbers. I am curious about two things.
1) Will we see separate score totals for each school's first and second attempts to pass an EOC? and 2) Will we see the predicted growth versus actual performance for each school?

As to the first question, at our school we noticed an additional 5.8% "bump" when students who failed the first test took the second one. Interestingly, our math and science retakes showed higher gains than English and Social Studies. I wonder if that held true across the district, and what it says about those tests.
As for the second question, I think it is absolutely critical to examine student growth when evaluating each school's performance. For example, if Ardrey Kell has a 98% composite pass rate and West Meck a 70% pass rate, on the surface it seems like A.K. did better. Yet if the students at A.K. scored lower than expected while West Meck students made major gains that just missed passing, then West Meck may have actually outperformed A.K.
Any thoughts?

Ann Doss Helms said...

Most recent Anon: What we usually get is whether schools made, exceeded or fell short of expected growth overall. You're exactly right that schools with high pass rates don't always make growth, and schools with low pass rates sometimes do.

I've been assured we'll get school-by-school results today, but I haven't seen the format, so I don't know whether it'll include pre- and post-retest numbers. I know last year Dr. Gorman and his crew were pretty candid about reporting how much of the elementary/middle gains came from retesting.