The Arcade Fire – Ready To Start
In the spirit of finally wrapping things up in this series without having yet taken a definitive stand on much of anything, I offer Five Propositions on the L.A. Times value-added analysis for anyone who may care.
1. The very notion of a value-added score needs to be clarified for all personnel within LAUSD. Just this week, in speaking with two colleagues, I quickly determined that the three of us had three differing understandings of exactly how a value-added score is calculated. If teachers don’t understand how their score is calculated (and no one from LAUSD has explained it), then any potential for using that data effectively is lost. Ignorance is not an appropriate response.
2. The validity of a value-added score needs to be definitively addressed by the district. There are enough concerns about the method behind it to warrant further investigation. I’ve yet to delve into the actual statistical modeling, but a close colleague of mine who did raised some serious concerns about the mathematics involved.
3. “Value-added” (or any use of student test scores) needs to be used primarily for the internal purpose of supporting teachers. We should be using data to help us learn to teach better, identifying specific teacher qualities or actions that lead to student success. LAUSD should immediately release value-added scores to teachers (which is happening) and train school leaders to facilitate conversations about how to use those scores to improve instruction (which isn’t happening).
4. Value-added data should not have been made public within the Times database. The Times should be rebuked for what can be seen as journalistic irresponsibility. As I mentioned in the previous post, publishing the database added nothing to their substantial series of reports on teacher quality.
5. Despite the Times’ irresponsibility, the UTLA boycott is stupid. Boycotting the Times makes even less sense than publishing the database. A UTLA boycott will never stop the Times from publishing whatever it wants, whenever it wants. In fact, the union needs to lead the conversation about teacher effectiveness rather than simply react to whatever is presented.
So given these propositions, what should happen next?
For the next three years, 2010-2011 through 2012-2013, LAUSD needs to (a) tweak the value-added method to be (or find a new test-score method that is) more statistically reliable, (b) disseminate that information privately to schools and teachers, and (c) provide administrators and school leaders with tools to help translate low scores into better teaching. And that’s it.
If, after those three years, value-added has been tweaked and developed enough to accurately help teachers improve their practice, then steps do need to be taken to address teachers who fail to improve. Obviously, and as most have suggested, a value-added score should only be used as a partial measure in continuing-employment decisions. It has been suggested that at least 30% of a teacher’s effectiveness rating be based on value-added, which raises the questions: 30% of what overall measure, and what percentage counts as “passing”? There has been far too much rhetoric thrown around by all involved without anyone coming up with definitive steps to use data to actually help kids learn more and teachers teach better. A three-year delay before using value-added in personnel decisions allows a nuanced plan to be developed with all parties involved (including parents, whom I’ve thus far failed to mention) to ensure that the teaching corps in Los Angeles is one that is getting better all the time.
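To make the “30% of what” ambiguity concrete, here is a toy sketch in Python of the kind of weighted composite such a policy implies. Every weight, measure name, and the passing threshold below is an illustrative assumption of mine, not anything LAUSD, UTLA, or the Times has actually defined:

```python
def composite_score(value_added, observation, other, weights=(0.30, 0.45, 0.25)):
    """Combine three hypothetical 0-100 measures into one weighted rating.

    The 30% weight on value-added matches the figure floated in the debate;
    the remaining measures and their weights are made up for illustration.
    """
    w_va, w_obs, w_other = weights
    assert abs(w_va + w_obs + w_other - 1.0) < 1e-9, "weights must sum to 1"
    return w_va * value_added + w_obs * observation + w_other * other


# Even with value-added fixed at 30%, a teacher's fate depends entirely on
# what the other 70% measures are and where the "passing" line is drawn.
score = composite_score(value_added=60, observation=80, other=70)
passing_threshold = 70  # arbitrary; this is exactly the undefined "% for passing"
print(score, score >= passing_threshold)
```

Shift either the unspecified 70% of the formula or the threshold, and the same value-added score flips a teacher from “passing” to “failing,” which is why the policy needs to be pinned down before it is used for employment decisions.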
Previous posts in my response to “Grading Teachers”:
Part 1: In the Beginning
Part 2: The L.A. Times: Reporting or Creating the News?
Part 3: United We What?
Part 4: A Line in the Sand
Part 5: On Blast