Mr. Parello Sensei

May 10, 2013

Some Unintended Conclusions from Value Added Data

There has been a lot of debate over the past several years about whether to use “value added” data as a metric for evaluating teachers.  Washington, D.C. already bases 50% of a teacher’s evaluation on their value added rating.  I thought I’d weigh in on the debate, as I happen to have a publicly available value added ranking of myself up on the internet.  According to the prestigious Los Angeles Times newspaper, I am a “least effective teacher” in math and a “less effective” English teacher:

In August 2010, the Los Angeles Times published the value added rankings of teachers within the district who had taught 2nd through 5th grade.  Until a year ago I remained unaware that I was on there, since I assumed they would not have published the scores of a one-year teacher who no longer worked in LA Unified.  Now, thanks to the LA Times, I have a permanent souvenir of that year: the fourth result that comes up when you Google my name.
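
Before getting into my own numbers, it is worth sketching what a value added score is actually measuring.  The short Python sketch below is a deliberately simplified illustration with invented scores and hypothetical teachers; the Times’ actual model was more elaborate than this, but the basic idea is the same: predict each student’s current test score from their prior score, then credit or blame the teacher for the average amount by which their students beat or missed that prediction.

    # A deliberately simplified value added calculation with invented numbers.
    # This is NOT the LA Times' actual model, which was more elaborate;
    # it only illustrates the basic "predicted vs. actual growth" idea.
    import numpy as np

    # (prior-year score, current-year score, teacher) for invented students
    students = [
        (600, 640, "Teacher A"),
        (550, 585, "Teacher A"),
        (700, 720, "Teacher A"),
        (600, 660, "Teacher B"),
        (550, 610, "Teacher B"),
        (700, 750, "Teacher B"),
    ]

    prior = np.array([s[0] for s in students], dtype=float)
    current = np.array([s[1] for s in students], dtype=float)

    # Step 1: predict each student's current score from their prior score,
    # using a simple linear fit across all students.
    slope, intercept = np.polyfit(prior, current, 1)
    predicted = slope * prior + intercept

    # Step 2: a teacher's "value added" is the average amount by which their
    # students beat (or fell short of) their predicted scores.
    residuals = current - predicted
    for teacher in ("Teacher A", "Teacher B"):
        idx = [i for i, s in enumerate(students) if s[2] == teacher]
        print(teacher, "value added:", round(float(residuals[idx].mean()), 1))

Roughly speaking, averages like these are what end up bucketed into labels such as “least effective” and “less effective” in the published rankings.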

Now far be it from me to defend my first and only year as a corps member; it wasn’t pretty.  I do think the value added score shows that I had a particular problem teaching math that year, and although I don’t enjoy having that fact blared across the internet, I don’t think it’s inaccurate to say that I was far less effective than the average LAUSD math teacher.  In an effort to teach conceptual rather than algorithmic thinking, I did not give students enough practice with the kinds of problems that would appear on the test, and test performance suffered.  In English, I followed the Open Court program very precisely and had an easier time integrating what I learned in professional development, so despite consistently chaotic classroom management, I managed to land closer to the average for Los Angeles Unified teachers.

But how valuable is that data, really?  On a closer look at the site for this teacher rating project, I found that it is possible to draw some conclusions that really call the value of the value added rating system into question.  Although I am annoyed that the LA Times published this data, since it has been up on the internet for some time, I think it’s interesting to see what can be gleaned from it.  I looked up my former program director from my first year.  Although her value added math score was better than mine, falling in the average range, her English rating was in the same statistical range as my own:

I can draw two possible conclusions from this: either TFA is hiring unqualified people to lead its corps members, or the value added score doesn’t say much that’s valuable about a teacher’s effectiveness.  Personally, I lean towards the second conclusion.  I’m pretty certain that my program director, with two years of teaching to my one, learned to be a more effective English teacher than I did.  According to the value added model, though, she was slightly less effective than I was, and I have a hard time envisioning anyone getting hired for a TFA staff position if that were actually the case.  Nevertheless, since TFA is closely aligned with the movement to use value added data for evaluating teachers, they should be aware that they have hired at least one program director who, by that metric, is worse at teaching English than a corps member who quit after his first year.

Looking further at the teacher ratings project, I came across a link to the LA Times’ list of the least effective schools based on value added ratings.  Here’s one of the schools I found:

This school has a 938 API, almost the maximum possible, yet it is ranked as a “least effective school” by the Times.  Again, this leads to a conclusion that I don’t think the Times intended.  If a school can have a 938 API and still be ineffective at contributing to student learning, that would seem to suggest that school and teacher quality have practically zero effect on how students perform in school.  I don’t actually believe that to be the case, but the fact that the value added data produces that conclusion leads, in turn, to the conclusion that ranking teachers’ and schools’ effectiveness based on student test data is pointless.

Thank you for this project, LA Times.  I think the conclusions it leads to should be enough to disqualify value added from ever being used as a metric for formally evaluating teachers.
