Comments on Your Schools: Teacher ratings: Best and worst in Charlotte region

Anonymous (2014-02-28 19:51):
As a middle school teacher in CMS, I find this article misleading and confusing. Let's look at your "good" list: #10 is South Charlotte Middle, with 51.4% of teachers exceeding growth. On the "bad" list, however, Grier Middle School has 46.2% of teachers who failed to meet the growth standard. Isn't it possible that these schools aren't that different? For example, couldn't 42% of SCMS teachers have failed to meet the target? And couldn't another 50% of Grier Middle's teachers have exceeded growth? This is a sad excuse for journalism and a gross manipulation of data.

Anonymous (2014-02-24 17:36):
When will the parents and teachers get together and tell all these people to go to hell? No more tests. No more intrusion from Raleigh. My son's school is losing its best teachers. Why is NC so bad? Why did only the youngest teachers get a raise? I am much better at my job in my 30s than I was in my 20s. What is going on here?

Anonymous (2014-02-24 17:28):
Almost as bad as being the parent of students.

Anonymous (2014-02-24 17:21):
What a mess. Teaching in this state must be awful.

Anonymous (2014-02-24 16:10):
Thanks, Craig -- that's helpful!

Craig Smith (2014-02-24 15:56):
Change in Policy for Determining Educator Effectiveness Status:
http://blogofcraigsmith.blogspot.com/2013/12/change-in-policy-for-determining.html

Anonymous (2014-02-24 11:00):
And now Michelle Rhee, one of America's top EduFrauds, has proposed a new statistic in the VAM model: Grit! That's right. She believes we can come up with a metric and algorithm to measure a student's "grit." God help us.
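The first comment's statistical point can be made concrete: "percent exceeding growth" and "percent failing to meet growth" are different slices of a three-way split, not complements of one another, so the article's two published numbers could coexist with nearly identical schools. The splits below are invented purely for illustration; only the 51.4% and 46.2% figures come from the article.

```python
# Hypothetical three-way splits (exceeded / met / failed growth) consistent
# with the single number the article published for each school.
scms = {"exceeded": 51.4, "met": 6.6, "failed": 42.0}   # article reported: 51.4% exceeded
grier = {"exceeded": 50.0, "met": 3.8, "failed": 46.2}  # article reported: 46.2% failed

for name, school in [("South Charlotte Middle", scms), ("Grier Middle", grier)]:
    # each split is a full distribution summing to 100%
    assert abs(sum(school.values()) - 100.0) < 1e-9
    print(name, school)
```

Under these (made-up) splits, both schools have roughly half their teachers exceeding growth and roughly 40-46% failing to meet it, even though one landed on the "best" list and the other on the "worst" list.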
Pamela Grundy (2014-02-24 08:58):
There certainly has been a lack of communication with teachers, which is appalling.

The three years refers to three consecutive years of teacher "effectiveness" ratings. The theory is that while a rating from a single year may be off (in fact, studies show enormous swings from year to year), averaging three consecutive years of ratings smooths out the swings, so a teacher's overall ranking is more likely to be accurate. Studies, however, have not been kind to this perspective, as the data remain pretty much all over the map.

At MecklenburgACTS.org, we've got an expert working on a layperson-friendly explanation of all this (which should be far better than mine), and we hope to have it available in the next few weeks.

Shamash (2014-02-24 08:04):
I don't see how EVAAS can be used to track "growth" for first-year students, especially since they (SAS) say they need at least three years of data to dampen the noise in their measurements. Of course, that doesn't mean some moronic educrats won't try to misapply the tool.

Pamela Grundy (2014-02-24 07:15):
I'm always making that mistake too. Hmmmmm....

Anonymous (2014-02-23 23:30):
Oops, EVAAS.....Freudian slip!

Anonymous (2014-02-23 21:28):
So are they evaluating the teacher based on how their class did last year, this year, and next year, or are they tracking each student for three consecutive years?

I was a first-year teacher last year and met expected growth, but it looked like not all our teachers were counted in the bottom box of the EVASS ratings the Observer linked to. I'm guessing new teachers were excluded because it was their first year of ratings?

We are being assigned a score of effectiveness based on these things. Wouldn't it be great if someone would actually explain it all to us?

Pamela Grundy (2014-02-23 20:18):
If you are interested in information on opting out in North Carolina, you might take a look at the opting-out section of the MecklenburgACTS.org website.

Anonymous (2014-02-23 20:16):
The only logical solution is for parents to have their children opt out of these tests. It is easy to do, and thousands of parents across the country already refuse to allow their children to be subjected to these worthless tests. There are many websites with advice for parents. No test results = no data. Simple enough.
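Pamela Grundy's explanation of the three-year rationale, and the commenters' skepticism about it, can both be sketched numerically: if a teacher's single-year rating is mostly noise around a constant true effect, a three-year average does swing less (by roughly a factor of the square root of 3), but it does not eliminate the swings. The noise level here is an invented assumption, not a published EVAAS parameter.

```python
import random

random.seed(42)

true_effect = 0.0   # assume the teacher's true effect never changes
noise_sd = 1.0      # assumed year-to-year noise in rating units (illustrative)

def single_year():
    # one year's rating = true effect plus a large random swing
    return true_effect + random.gauss(0, noise_sd)

def three_year_avg():
    return sum(single_year() for _ in range(3)) / 3

def sd(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

singles = [single_year() for _ in range(10000)]
averages = [three_year_avg() for _ in range(10000)]
print(sd(singles), sd(averages))  # the averaged ratings vary noticeably less
```

The averaging helps only under the assumption that the noise is independent each year and the true effect is stable; the studies quoted later in the thread question exactly those assumptions.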
Pamela Grundy (2014-02-23 20:13):
I'll rephrase. First, the amount of an individual student's "growth" is determined through the enormous database that goes back many, many years. Then the EVASS formula is used to determine how much of that "growth" can be attributed to the teacher. That second number, such as it is, is the "value" that teacher supposedly "added" to the student's score.

Wiley Coyote (2014-02-23 19:08):
Value-added can't go back "many, many years." It only goes back as far as the student, because it compares actual versus predicted scores. For that reason, it is suggested that value-added shouldn't be used below third grade. But you're right: there are so many variables with value-added that it's ridiculous to even use it.

Shamash (2014-02-23 18:33):
Based on accounts of how erratic those numbers are, maybe THREE YEARS OF CONSISTENT RESULTS from EVAAS would be a better measure. I'm betting that will rarely, if ever, happen.

Pamela Grundy (2014-02-23 18:24):
Regarding the value-added ratings, this is my understanding.

The published ratings are based on only the 2013-14 tests.

The "growth" is determined by matching the scores on those tests against a gigantic database of tests that goes back many, many years. How that happens is extremely complicated.

I think value-added was also done in 2012-13. Those are separate numbers, so some teachers may now have two years of numbers.

No teachers are supposed to incur consequences based on value-added until they have three consecutive years of numbers (at which point the numbers are supposedly "more" reliable, though according to studies still not particularly reliable). And I think they get to pick the best two years out of those three. I'm not clear whether the clock started for some teachers last year, or whether it's starting this year.

No matter which, it continues to be a huge waste of time and money.
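The mechanics Pamela Grundy and Wiley Coyote describe, predicting each student's score from prior scores and crediting the residual to the teacher, can be shown in a minimal sketch. This is emphatically NOT the SAS EVAAS formula (which is far more complex and uses a large longitudinal database); the naive prediction rule and the sample scores below are invented for illustration only.

```python
# Minimal sketch of the actual-versus-predicted idea behind value-added models.

def predict_score(prior_scores):
    """Naive prediction: a student's expected score is the mean of prior scores."""
    return sum(prior_scores) / len(prior_scores)

def teacher_value_added(students):
    """Average residual (actual - predicted) across a teacher's students."""
    residuals = [actual - predict_score(priors) for priors, actual in students]
    return sum(residuals) / len(residuals)

# Hypothetical class: (prior-year scores, this year's score) for each student.
students = [([50, 52], 55), ([60, 58], 57), ([70, 72], 74)]
print(teacher_value_added(students))  # residuals 4, -2, 3 -> average 5/3
```

The sketch also shows why, as Wiley Coyote notes, the approach needs prior scores for every student: with no testing history there is nothing to predict from, which is why it is suggested only for third grade and above.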
Shamash (2014-02-23 18:07):
And here's the Sass (NOT SAS) study mentioned before about the instability of model results:

http://www.urban.org/UploadedPDF/1001266_stabilityofvalue.pdf

Where we earlier found (page 5): "Nonetheless, it is clear that different tests result in different teacher rankings."

We also find this gem (page 3): "McCaffrey et al. demonstrate that much of the variation in estimated teacher effects is in fact due to independent student-level variation in test performance over time, rather than changes in true teacher productivity."

Well, now ain't that a big DUH! It's the STUDENTS (not the TEACHERS) who are INDEPENDENTLY performing differently on those tests. Who would have guessed?

Shamash (2014-02-23 17:42):
The SAS Education Value-Added Assessment System (SAS® EVAAS®) in the Houston Independent School District (HISD): Intended and Unintended Consequences

http://files.eric.ed.gov/fulltext/EJ971428.pdf

Page 4: "Even though the district reported that the majority of teachers favor the ASPIRE program overall (Harris, 2011), researchers found evidence suggesting that HISD teachers have aversions towards the program's SAS® EVAAS® component (Collins, in progress). In terms of reliability, those receiving merit monies attached to their SAS® EVAAS® output often compare winning the rewards to 'winning the lottery,' given the random, 'chaotic,' year-to-year instabilities they see. Such inconsistencies are also well noted in literature (Baeder, 2010; Baker, Barton, Darling-Hammond, Haertel, Ladd, Linn et al., 2010; Haertel, 2011; Koedel & Betts, 2007; Papay, 2010). Teachers do not seem to understand why they are rewarded, especially because they profess that they do nothing differently from year to year as their SAS® EVAAS® rankings 'jump around.' Along with the highs come much-appreciated monetary awards, but what teachers did differently from one year to the next remains unknown."

So, because the results jump around, the teachers aren't even sure why they get rewarded. Sounds like a REAL EFFECTIVE way to "improve" teaching to me. Of course, we ALL know that there is no way these tests can actually give feedback on teaching techniques, so what can a reward (or punishment) actually accomplish, since it can't be directly tied to any SPECIFIC BEHAVIOR on the part of the teacher? (This stuff gets more ridiculous the more I learn about it...)
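The "winning the lottery" complaint follows directly from the McCaffrey finding quoted above: if most of the variation in estimated teacher effects is student-level noise rather than true teacher productivity, the same teachers get very different rankings in consecutive years. The simulation below illustrates that logic with invented parameters (a small spread of true effects swamped by large noise); it is not calibrated to any real EVAAS data.

```python
import random

random.seed(7)

n_teachers = 200
true_effects = [random.gauss(0, 0.3) for _ in range(n_teachers)]  # small true spread

def yearly_estimates():
    # each year's estimate = stable true effect + large student-level noise
    return [t + random.gauss(0, 1.0) for t in true_effects]

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(xs, ys):
    # Spearman rank correlation (no ties expected with continuous values)
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

year1, year2 = yearly_estimates(), yearly_estimates()
print(spearman(year1, year2))  # well below 1: rankings "jump around"
```

Under these assumptions the year-to-year rank correlation is close to zero even though every teacher's true effect is perfectly stable, which is the commenters' point: the instability need not reflect anything teachers did differently.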
Shamash (2014-02-23 17:25):
Ann,

I'm going to do more than one post on this...

Here's an example of a critique of the "3 tests" in which the tests are NOT THE SAME.

http://www.serve.org/uploads/docs/EBE%20Responses/500_Teacher%20evaluation%208.6.09.pdf

See pages 6 and 7: "The value-added approach makes an assumption that tests can be equated from year to year or across subjects such that a scale score one year means the same thing the next year. The extent to which differences in tests or forms of tests affect value-added scores is another consideration in interpreting value-added scores of teachers. Sass (2008), in a study using Florida data, reported that '. . . it is clear that different tests result in different teacher rankings' (p. 5)."

There are several "responses" to critiques such as this which mention a dampening effect on errors when more years of tests are used, with THREE YEARS typically being the MINIMUM. SAS is typically unclear, though, on whether the tests need to be the SAME.

I found a report from HISD which seems to verify the randomness of results from the SAS model. It equates getting a bonus based on EVAAS with winning the lottery, since it's so random. That's coming up next...

Anonymous (2014-02-23 16:46):
CMS is using three years of testing, but not on the same tests.

Anonymous (2014-02-23 16:08):
SAS isn't designing the tests, but they are crunching the results to create the value-added ratings. I am almost certain NC is using three years' worth of test scores, but the previous two years are from very different state exams, so it's reasonable to question how well they can project results onto the new tests.

SAS actually has released the formulas, though they're incomprehensible to most of us. I blogged about it in December, but I just now checked the link in that post that used to go to the formulas, and it no longer works. But here's the post:

http://obsyourschools.blogspot.com/2013/12/academic-growth-formula-not-secret-just.html

Anonymous (2014-02-23 14:49):
http://www.youtube.com/watch?v=5DxDEoDojJ4

Anonymous (2014-02-23 14:44):
So who made the decision to go with the SAS tests? DPI? Legislators? A former Governor?
And how do we get the train back on the track?

To quote a famous individual who was way ahead of his time: "Jane! Stop this crazy thing!!!" ~~ George Jetson