Moderation policy change: unfounded assumptions

It’s worth pointing out here that this attitude tends to be the key component missing from moderation at toxic dumps like Facebook and Twitter. They try to cheap out, relying on clunky and inflexible automated systems that generate lots of false positives, and/or on human moderators who are “trained” by being ordered to blindly follow guidelines (usually written by highly risk-averse lawyers) that are simultaneously overly strict and overly permissive.

Rules, standards, and technology systems should exist to support human moderators who are invested in the community and its values, and who are trusted and thus empowered by the publishers of the sites they keep an eye on.