Month: February 2012

  • Teacher Data Reports

    [Image from the NYTimes Teacher Data Reports]

    The internet is abuzz today, as it has been for several weeks while the release of the Teacher Data Reports grew imminent. The NYTimes went ahead and released their reports last night - ahead of all the other news media - but within a week, the public will be awash in inaccurate numbers that don't teach us much. If you've ever heard the line "57%, no 34%, of all statistics are inaccurate," now's the time to believe it.

    I'm usually not one to get all up in arms about educational issues. I do have strong opinions, but the problems are usually so large that it's not worth trying to cut through the rhetoric. The release of the TDRs, however, has gotten me fuming mad. Let me elaborate.

    First of all, I started digging into WHO wanted these reports in the first place, and there is an interesting void of information. Initially, it was the New York City Department of Education that rallied for them, and they had been involved in a court case with the United Federation of Teachers (aka "The Union") to allow the release of these numbers. The DOE is full of "reformers" who have not been in a classroom for years, and yet profess to know what makes a good school run. They have flip-flopped the educational systems of this city every time there has been an administration change - sometimes overhauling entire systems - only to yield similar results. Interestingly enough, now that the numbers are finally out and the wide margins of error (more on that later) have been revealed, they are DISTANCING themselves from the very reports they fought so hard to release. In an e-mail to DOE employees, Chancellor Walcott says, "the courts have said that we are legally obligated to release Teacher Data Reports," in an attempt to distance the DOE from the ensuing debacle (Letter from Chancellor Walcott). That's like giving a bully a big stick and then denying any involvement when someone comes in with a black eye.

    Now, let's talk about these numbers. You're going to need a lesson in statistics if you hope to understand what all of them mean. I've highlighted a random report above to show you what these look like. At face value, the public is going to look at the bolded numbers and think 'high means good, low means bad.' If only it were that simple. I'm glad they at least thought to put the sample sizes in. The first pair of teachers looks to be doing phenomenally well compared to the third 4th grade teacher - until you look at their sample sizes. The first PAIR (just so we know, that means two teachers) had 19 students determining their Value-added rating, while the last teacher, solo, had up to 102 students behind her negative Value-added rating. Given that mandated class size can reach 34 students in NYC, I don't see how it's fair to compare samples of such vastly different sizes.
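    To see why sample size matters so much, here's a back-of-the-envelope sketch in Python. The numbers (a student-level standard deviation of 15 points) are invented for illustration, and the real value-added model is far more complicated, but the basic statistics hold: the uncertainty around an average shrinks only with the square root of the number of students.

        # Back-of-the-envelope sketch: NOT the DOE's actual value-added model.
        # Assumes a hypothetical student-level standard deviation of 15 points.
        import math

        def margin_of_error(n, sd=15.0, z=1.96):
            # Approximate 95% margin of error for a class-average score.
            return z * sd / math.sqrt(n)

        for n in (19, 34, 102):
            print(f"n = {n:3d} students -> roughly +/- {margin_of_error(n):.1f} points")

    With 19 students the estimate swings by about 7 points either way; with 102 it's under 3. Ratings built on the smaller samples carry far more noise.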

    Data aside, you should know where these numbers come from in NYC. Tests. Standardized tests. Though the state is constantly reforming its tests, a lot of private universities have written off NY State tests and prefer that their applicants take nationwide standardized ones, because the NY State tests are ridiculously easy compared to national standards. The cold hard fact remains: NYC public school students from predominantly high-poverty areas do not perform well in school or on standardized tests. The bar has been lowered and lowered instead of addressing this fact, because it's easier to blame the system than to try and fix systemic inequality and poverty.

    My only experience with this has been at the high school level, offering French Regents exams. The day before the exam, several students informed me that they weren't going to take the test. They didn't feel ready, and the test was optional for their graduation. As a teacher, I understood and valued their decision. As a NYC teacher, I fretted, because those tests would show up as failures on my record. The day of the exam, a student overslept and missed it. Another failure on the part of the teacher. All told, only two of my students actually failed the exam, but my overall data for that exam showed a much lower return. It's so bad that on the morning of the exam, my colleagues are calling students to get them out of bed, or even telling students who aren't likely to pass not to sign up in the first place. And did I mention passing is only a 65? Yes, you are given a green light if you score a 65 on a state exam - even though that may show a student barely grasps the material. No matter, off to college you go. Keep in mind that this is some of what goes on behind these numbers.
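    To make that arithmetic concrete, here's a quick hypothetical sketch - the roster numbers are invented, not my actual class - showing how opt-outs and no-shows drag a teacher's numbers down:

        # Hypothetical roster, invented for illustration: no-shows and opt-outs
        # count against the teacher just like students who sat the exam and failed.
        registered = 20
        took_exam = 16      # several opted out the day before; one overslept
        failed_exam = 2     # only two who actually sat the exam failed

        passed = took_exam - failed_exam
        print(f"Pass rate among test-takers: {passed / took_exam:.0%}")   # 88%
        print(f"Pass rate on my record:      {passed / registered:.0%}")  # 70%

    Same teaching, same students, two very different numbers - and only the second one follows the teacher around.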

    A teacher complained in a response published alongside the NYTimes TDRs:

    “This data is based on ONE test taken on ONE day when several variables, such as child poverty, quite possibly will affect student performance,” Lea Weinstein, a teacher at Middle School 45 in the Bronx, wrote in a response to her rating that she sent to The New York Times. “Yes, I administered this test that generated this data to my sixth-graders two years ago. I no longer teach sixth grade, and I no longer teach in the same school, or even the same subject. How is this data relevant today?”


    This leads me to my next complaint: the media. The fact that the NYTimes was asking teachers to submit a "response" to accompany their TDR is troubling to me. I refuse to let them off the hook for offering teachers a chance to "plead their innocence" when they were really just keen on publishing the reports. The fact that they published them the very evening they solicited those responses proves it. I'd like to see the return rates - do they assume that all NYC teachers read the NYTimes? Other media outlets will soon follow. This works out well for the NYC DOE, because the media ends up taking the real blame for wanting the reports published. ("After Championing Release, City Says It Did Not Want Teacher Data Public")

    The future of the TDRs seems to rest on the reaction of the public. How are parents using this information? What does the general public think? How will teachers react? When Los Angeles (the only other city to release TDRs) published its reports, teacher Rigoberto Ruelas committed suicide - due, some claim, to the low rating he received. He was already struggling with depression, but it's telling that his death coincided with the publication of the reports. When I first heard that, I thought it was a little absurd, but now that I'm seeing my colleagues' names in print for public scrutiny, I can feel how utterly uncomfortable that reality is.


    [Image: UFT Ad in the Daily News]