Want the platforms to police bad speech and fake news? The copyright wars want a word with you.

Originally published at: https://boingboing.net/2018/09/26/doomed-to-repeat-it.html

6 Likes

The real solution is to design social networks to promote exploration and discovery rather than polarization. That is, stop willfully creating filter bubbles that parrot people’s opinions back to them, and instead push content that challenges users, creates dialogue, or connects people from different backgrounds.

Then hate speech will be less of a problem.

4 Likes

Don’t buy the whole mastodon when a tusk will do. Don’t police mere content; police content that pushes or advocates behaviors that are corrosive to ecosystems.

Just as an example, doxxing is bad. All doxxing. Forever. No matter who it targets. It’s an ecosystem-level problem. Advocacy for doxxing should be an instaban on every platform. (To that end: eliminate real-name policies.) Right now we have a weird system where doxxing is okay if it targets certain people who’ve done certain things. But this disregards the fact that doxxing itself is a corrosive behavior, and it’s an all-or-nothing category. You get to choose whether you have it on your platform or not, not what kind you get.

Doxxing is just one major example, but there are others. Overall, I’m not sanguine about the major platforms policing the speech they consider acceptable, and I worry that the old regime with all its flaws may prove preferable to the brave new world that we’re cobbling together in an ad-hoc, reactive fashion.

4 Likes

I think community moderation is the most effective solution. The worst platforms for abuse on social media are the ones in which moderation is done only by the platform owner.

I know parts of Reddit can be toxic at times, but I don’t think people give its community moderation enough credit. On Reddit, the moderators are part of the community. I can talk about being gay, or about queer issues, on Reddit without worrying about my posts being flagged. Conversely, on platforms like Facebook or YouTube, my posts can be flagged as hate speech when I’m potentially only talking about my own life.

4 Likes

Moderators cost money… and somehow one of the wealthiest companies in the world can’t scrape the cents together to make a 9-5 job out of it.

5 Likes

Oh yes as a mashup artist, mashup video producer and podcaster and blogger, I know ALL about the copyright wars.

Thing is, the DMCA notices are sent by bots; you contest them, and 10 days later the content is back in that grey area where they might come in the night again (not legally: mashup videos for parody and political uses were made legal here years ago, and Fair Use applies in the States, but the corporates aren’t listening and don’t want the caselaw). I have a video that has been pulled 3 times this way. It’s stupid.

But it gets even worse: you get false DMCA claims from companies that exist to illegally claim others’ work and monetise it. I never monetise, out of principle, cos hey, it might be my work but not my content, and money should go to the artists first. But it’s annoying when some leech tries to take it all falsely.

Also had weird stuff where links have been removed from Google or Mediafire on a keyword basis that had NOTHING to do with the content. In one case, a zip archive was banned forever because it had the word ‘Pop’ in it. Someone was trying to stop a leak of some POP3 software, and went and banned any zip file with a similar name across the network, including this zip file full of MP3s. Not even an app inside? Weird.

And of course, as with Facebook, complaining does NOTHING to change it. They forced me to change my name cos someone I argued with, who hated queer people, complained, and I think the same person regularly gets my stuff flagged and gets me restricted. I still cannot change my name back, even though they ‘agreed’ to allow LGBTQ people to use different names for safety reasons (I’ve gotten death threats). But the same company cannot do anything against the film-piracy comment spammers I’ve reported, or those who threaten me, or bots? You’re talking to a brick wall.

That’s exactly the dumb sort of AI/bots that will be let loose on moderating speech, and it’ll be a nightmare. AI cannot solve this alone.

I really get the sense these sites are actually run by bigots hiding under a pretence of being progressive. Like Apple, their actions show they don’t actually care about us. Their words say whatever will get us to shut up, but their ACTIONS speak louder: leaving incontestably fascist and homophobic content up in their stores and sites, leaving Alex Jones alone, etc. They only remove these things after mass protests; they do not care about us. They say they do, that they love diversity, etc., but then they seem to do the opposite cos it makes them money.

4 Likes

Requiring or encouraging companies to use their manipulative prowess to covertly educate you in the opposite of the views you hold, in an attempt to achieve some mythic “center,” is straight out of a dystopian nightmare.

Also who defines the center? If I say the sky is blue, and you say it is yellow, does the Company try to convince us both the sky is green?

If I say “black people are genetically inferior,” does the Company try to convince you that they might be a little bit, or does the Company shut me down?

What if the Company believes things that are demonstrably false, like taxes are always bad and inhibit economic–wait you know, they are already doing that one lol

1 Like

For non-political topics, reddit can be great. It’s the only place I can think of that has small communities for any niche interests one might have. It’s really helpful in that regard. For anything even slightly political, though, it seems to be pretty toxic. I don’t mean just the right, or just the left, I mean all of it. Things seem to inevitably move in the direction of shitty little virtual fiefdoms ruled by whichever ideologues are willing to spend the most time engaging in petty power grabs, e.g. sockpuppets, brigades, mass shitposting, etc. It sounds pretty boring to me, but I guess a feedback loop that rewards the politics of emotion can be pretty addictive.

edit: I should add, this seems to happen cyclically with periods of relative calm. This probably has to do with the fact that being a responsible moderator is a full time job that very few people are willing to do for years on end without pay.

2 Likes

The company doesn’t “define the center” or endorse specific viewpoints. It measures the polarization of its network, and where pockets or filter bubbles start to form, it cross-fertilizes users’ feeds with content that tends to smooth out that cluster. The algorithm isn’t targeted. It doesn’t know who it’s interacting with. It just knows that there’s a cluster there, and perhaps the mood or intensity of its users.

Ad-driven algorithms that parrot your opinions back to you in progressively more extreme forms are far more dystopian than one that simply tries to expose you to new people and perspectives—which was, after all, what the Internet originally promised.
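To make the idea concrete, here is a toy sketch of what “measuring polarization” and “cross-fertilizing” a feed could mean, assuming a simple graph of users already labeled with opinion clusters. Every name, number, and function here is hypothetical; a real system would need far richer signals than this.

```python
# Hypothetical sketch: score how "bubbled" a network is by comparing
# within-cluster edges to cross-cluster edges, then surface content
# authored outside a user's own cluster.

def polarization(edges, group):
    """Fraction of friendship edges that stay inside one opinion cluster.

    edges: iterable of (user_a, user_b) pairs
    group: dict mapping user -> cluster label
    Returns 1.0 for a fully segregated network, near 0.0 for a mixed one.
    """
    edges = list(edges)
    within = sum(1 for a, b in edges if group[a] == group[b])
    return within / len(edges) if edges else 0.0

def cross_cluster_items(user, items, group):
    """Suggest items whose authors sit outside the user's cluster."""
    return [item for item, author in items if group[author] != group[user]]

# A tiny, fully segregated network of two clusters:
group = {"ann": "A", "al": "A", "bea": "B", "bob": "B"}
edges = [("ann", "al"), ("bea", "bob")]
print(polarization(edges, group))  # 1.0 -> a filter bubble

# One cross-cluster friendship lowers the score:
edges.append(("al", "bea"))
print(polarization(edges, group))  # ~0.67

items = [("post1", "ann"), ("post2", "bob")]
print(cross_cluster_items("ann", items, group))  # ['post2']
```

Note that this sketch never inspects who any individual user is, only cluster labels, which matches the “the algorithm isn’t targeted” framing above.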

1 Like

“It measures the polarization of its network”.

Provided we had an AI model capable of qualitatively assessing the content of speech, and of balancing the variety of viewpoints objectively and without dependencies…we might have more to worry about than who hears what online in that scenario.

But even then, any programmatic filtering is susceptible to even a loosely coordinated group. How many fake accounts would be required to radically alter what this AI believes is normal for a small, targeted group?
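As a back-of-the-envelope illustration of that worry (all numbers hypothetical), suppose the system estimated a small cluster’s “normal” opinion as a simple average. A handful of coordinated fake accounts can drag that estimate a long way:

```python
# Toy sybil-attack illustration: a simple average of opinions on a
# -1.0 .. +1.0 scale, and how few sock puppets it takes to move it.

def perceived_norm(opinions):
    """What a naive system would call 'normal' for this cluster."""
    return sum(opinions) / len(opinions)

# A small community of 20 genuine users, all mildly positive (+0.2):
community = [0.2] * 20
print(round(perceived_norm(community), 2))  # 0.2

# Five coordinated fake accounts at the extreme (+1.0) -- a quarter as
# many as the real users -- shift the perceived norm by 80% relative:
sybils = [1.0] * 5
print(round(perceived_norm(community + sybils), 2))  # 0.36
```

Real systems use more robust statistics than a mean, but the general point stands: the smaller the targeted group, the cheaper it is to brigade.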

How often do new ideas form without any hyperbole?

For teenagers, wouldn’t using the internet be akin to constantly trying to argue with a well versed adult wielding authority? Ideas need a certain solidity before they are challenged or they are merely subsumed. The word for that would be indoctrination.

I don’t actually doubt that if we ever get an AI system capable of what you want, it will be used to indoctrinate people into some mythical “center” useful to the corporation running it. But please don’t cheer them on.

“What if you had to work in a business to have equity in that business?” CAPITALISM GOOD!!!

“I was just asking” BUY THIS NOW!

2 Likes

The problem is that we already have a system that’s manipulating users’ worldviews, and its tendency is toward narrowness and extremity:

“The Constitution guarantees my right to own an assault weapon.” You may also be interested in this Christian Dominionist militia…

How do you mitigate that effect? If you’re having to hire a million moderators to censor hate speech, you’ve already lost.

What I’m proposing is an algorithm that does something like this:

“What if you had to work in a business to have equity in that business?” Here’s a business owner who shares your interest in Halo.

The idea isn’t to covertly change your opinions; it’s to make you aware of what the opposing arguments are, and of the humanity of those who disagree with you. Back when we actually interacted with our neighbors in the real world, this was a normal feature of social/community life that has since been eroded by the Internet.
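A minimal sketch of what the pairing described above could look like: surface someone who disagrees with you on a contested topic but shares an unrelated interest. The users, stances, and interests below are made up for illustration, and a real matcher would need much richer signals than a set intersection.

```python
# Hypothetical "bridging" recommender: find users with the opposite
# stance who share at least one non-political interest.

users = {
    "casey":  {"stance": "pro",  "interests": {"halo", "hiking"}},
    "jordan": {"stance": "anti", "interests": {"halo", "baking"}},
    "riley":  {"stance": "anti", "interests": {"chess"}},
}

def bridge_suggestions(me, users):
    """People who disagree with `me` but share a hobby with them."""
    mine = users[me]
    return sorted(
        name for name, u in users.items()
        if name != me
        and u["stance"] != mine["stance"]          # opposing view
        and u["interests"] & mine["interests"]     # common ground
    )

print(bridge_suggestions("casey", users))  # ['jordan']
```

The shared interest is what makes the introduction land as “a person like you” rather than “an opposing argument,” which is the whole point of the proposal.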

2 Likes

Seriously, they should be punished until they have to hire many many more moderators. It’s a job that can’t be done by robots. Flush the crappy jobs that can be done by robots, like gas station attendant and convenience store clerk, and hire people for the new economy. I’d rather have people learn about copyright than how to stock a shelf.

2 Likes

" What if you had to work in a business to have equity in that business?” Here’s a business owner who shares your interest in Halo."

That’s clever.

“this was a normal feature of social/community life that has since been eroded by the Internet.”

The pedophile hysterias of the 80s, coupled with nightly news coverage of every single scary crime in the country, are what eroded that feature, as suddenly children were not allowed to trust their neighbors. Even today most children are not allowed to go outside. The internet didn’t erode the thing; it grew to fill the vacuum left by our own paranoia.

The core question of what started the thread is, I guess, who owns this system, and are they executed for misusing it, including for any data breaches?

1 Like

I don’t know how to edit a post on my phone. The “executed” comment was supposed to be a slightly funny-sounding hyperbolic statement, but it sure doesn’t read that way.

The community should definitely be involved but the platform owner should be given the final say.

The late Steve Gilliard used to liken his blog’s comment section to a neighbourhood bar with lots of regulars who were valued by the owners and who in turn supported the values of the joint and respected the fact that, as invested as they were, it was owned not by them but by the owners. A good bouncer or two is also vital.

I’ve found that, for the most part, BB BBS works this way, which is why I stick around. Unfortunately, it’s a rarity and even with technology like Discourse that supports this approach I wonder how well it would scale.

1 Like

I’m talking about community moderation, volunteers (free labor) from the people who participate in the platform.

You get what you pay for.

For whatever reason, I was never inside a bubble on Facebook and Twitter, or at least not one relevant to my political interests. My “bubble” should have featured people interested in Murray Bookchin and, more generally, libertarian socialism, but I seemed to get endless far-right crap.

I would even have preferred to get status quo conservatism recommended to me over that shit.

3 Likes

Blah, that’s just a bad attitude to have. A lot of tech runs on volunteer labor, Linux for example. Also a lot of the world runs on free labor, from politics to child care. Don’t shit on volunteer contributions because they’re not monetized.

I’m not shitting on volunteers as a class. I’m taking issue with the idea that they are the solution to enterprise level externalities created by profitable corporations. We need people to clean up oil spills, too, but volunteers by themselves can’t (and shouldn’t) do all the work.

There’s volunteer labor and there’s exploitation. A dude running a phpBB forum has earned the right to a free beverage from me, in appreciation of their hard work. A person on YouTube working tirelessly to improve the environment for a company that pulls $110 billion in revenue is getting screwed. Different environments call for different levels of moderation, and a company like YouTube has the money to pay for the level of moderation it requires.

1 Like