A lot of bias is objective.
It becomes discrimination or *-ist when you ruthlessly apply statistical facts to individuals.
Whenever you discriminate against a group X because its members are more likely to be or do Y, that's unfair to those members of X who aren't Y or don't do Y.
Young men are more likely to crash their car than young women are (statistical fact, at least in Austria). But making someone pay higher insurance premiums because they are male is discrimination. Why should a careful young man pay more than a careless middle-aged woman?
On the other hand, women are likely to live longer than men. Should they therefore get lower retirement benefits?
[...] provides a veneer of objective respectability to racism, sexism and other forms of discrimination.
I think that is the wrong problem. The problem is not that these algorithms make things seem objective. The problem is that too many people think that "objectively justified discrimination" is respectable.
If the police are stop-and-frisking brown people, then all the weapons and drugs they find will come from brown people.
It is of course nice to defend "brown people" as a group, but what if it turns out to be true after all that the crime rate in one racial group is higher than in another? Is it then OK to stop and frisk people just based on their skin color?
I'm hoping that clusterism will turn out to be less bad than the traditional *isms, because two different machines will likely come up with different clusterings, whereas two different people are likely to divide the world into roughly the same races, ethnicities, and genders.
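The instability of machine clusterings can be seen even in a toy example. The sketch below (not from the original text; a minimal pure-stdlib k-means, with the dataset and function names my own) clusters the four corners of a unit square into two groups: depending on the random initialization, the algorithm settles on a horizontal split, a vertical split, or a lopsided 3-to-1 split, so different seeds yield genuinely different partitions of the same data.

```python
import random

def kmeans(points, k, seed, iters=20):
    """Plain k-means; returns the final partition as a set of frozensets."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # random initial centers -> different local optima
    for _ in range(iters):
        # assign each point to its nearest center
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0])**2 + (p[1] - centers[c][1])**2)
            groups[i].append(p)
        # move each center to the mean of its group (keep old center if empty)
        centers = [(sum(p[0] for p in g) / len(g), sum(p[1] for p in g) / len(g))
                   if g else centers[j]
                   for j, g in enumerate(groups)]
    # canonical form so identical partitions compare equal regardless of label order
    return frozenset(frozenset(g) for g in groups if g)

square = [(0, 0), (0, 1), (1, 0), (1, 1)]
partitions = {kmeans(square, 2, seed) for seed in range(20)}
print(len(partitions))  # several distinct clusterings of the same four points
```

Two people asked to group these four points would likely agree on one obvious split; twenty runs of the machine do not. That is the hoped-for saving grace of "clusterism": the categories are not stable enough to harden into a shared social taxonomy.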