American education's use of "value added measures" is statistically bankrupt

Yes and no. Here’s a real world example:

My wife, a science teacher, wanted to analyze the standardized test scores for the entire 6th grade at her school (the test was given three times a year). She wanted to see the results of an unofficial experiment she had been running for years. Because the counselors were lazy, she was the person who made class assignments, and for years she had been manipulating those assignments to get the worst students assigned to her. At one point she had 26 students with IEPs in a single class, 11 more than the law allows, and about 15–20 in each of her other classes. She had several average and gifted students too.

So you’d think her kids would learn more slowly than the other teachers’ kids, right? Dead wrong. Her kids progressed an average of 2 years in her class, and progression was higher the lower a student’s initial score was. The lowest progression was among her top kids, but at least they progressed.

Progression averaged 1 year or less for all of the other teachers. Worse, their advanced students declined.

Comparing students with similar initial scores across classes, my wife’s students still showed double the improvement of their counterparts in other classes.

The conclusion is that the proportion of slow learners in a class does not necessarily correlate with lower average improvement (the correlation can even be negative). At a gross level, it is possible to tell whether a teacher is very effective, average, or ineffective.
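
For anyone who wants to try this kind of comparison on their own school’s data, here’s a rough sketch in pandas. The numbers and column names (teacher, initial_score, final_score) are made up; the point is just the like-for-like grouping by initial score, plus the class-level check of whether a higher share of low-scoring students predicts lower average growth.

```python
# A minimal sketch of the comparison described above, using made-up data.
import pandas as pd

# Hypothetical fall/spring scores for a handful of students per teacher.
df = pd.DataFrame({
    "teacher":       ["A", "A", "A", "B", "B", "B", "C", "C", "C"],
    "initial_score": [310, 480, 650, 315, 475, 655, 305, 490, 640],
    "final_score":   [520, 640, 720, 380, 530, 640, 370, 545, 650],
})
df["growth"] = df["final_score"] - df["initial_score"]

# Bucket students by starting score so the comparison is like-for-like,
# then compare mean growth per teacher within each bucket.
df["band"] = pd.cut(df["initial_score"], bins=[0, 400, 550, 800],
                    labels=["low", "middle", "high"])
print(df.groupby(["band", "teacher"], observed=True)["growth"].mean().unstack())

# Class-level check: does a higher share of low-scoring students
# go with lower average growth? (The anecdote suggests it need not.)
by_class = df.groupby("teacher").agg(
    share_low=("band", lambda b: (b == "low").mean()),
    mean_growth=("growth", "mean"),
)
print(by_class.corr().loc["share_low", "mean_growth"])
```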

Another conclusion was that standardized testing can be used to find areas of weak understanding and then work on those areas with students. Not by teaching to the test, but by doing hands-on projects and open-ended questions that build the depth and breadth of knowledge students need to be flexible enough to handle any kind of test.

Testing is a tool. Used incorrectly, it’s worthless. Used correctly, it can be powerful. Unfortunately, it’s almost always used incorrectly.