Training bias in AI "hate speech detector" means that tweets by Black people are far more likely to be censored

And orange people are LESS likely to be censored.


> I'd like to point out that every day I see white, Asian, and Hispanic people communicating in exactly the same disrespectful way.

The problem isn’t “texting while black.” It’s using racial slurs in everyday communication. Frankly speaking, it’s time for everyone to stop doing that.

It may not be a popular opinion, but I don't see this flagging as a problem. I wouldn't feel sorry that someone's decision to be rude caused them some inconvenience.

Edit: Don’t get me wrong, this entire thing is a surveillance dumpster fire, but if the worst side-effect is that people have to stop casually using the N-word, etc. to send messages promptly, I’ll take it.

Good luck changing several billion people over to your worldview of what counts as an offensive word or racial slur.

In the meantime, let's live in the real world and try not to exclude huge sections of it whose people don't talk the same way you do (whatever way that is).


This topic was automatically closed after 5 days. New replies are no longer allowed.