As the Supreme Court eyes Section 230, here's what's at stake.

… like print media has always worked?

5 Likes

I do understand completely what you are saying, and also how 230 works; I’m only confused about how what I said could be interpreted as exactly backwards from the point I was making. But it’s my Friday and I’m tired, and I just don’t really feel like rephrasing it all. So whatever. I support your fight. If you want to have it with me to further the cause, I can be your bad guy today.

2 Likes

For publishers, yes, but that is no different from how things have always been, or are today. Publishers are, and have always been, responsible for the things they publish.

Section 230 allows companies to host comments and posts from their users by exempting them from liability. In general, they have no liability at all for legal content their users might post, even in civil suits. They have to make good-faith efforts to moderate/remove illegal content, which in the US is a very tiny bit of stuff (child abuse images and direct calls to violence, but you know how that goes).

Section 230 isn’t about publishers (unless the publisher has a comments section). It’s about allowing web 2.0 sites to generate profits from the content their users create.

For example:
Boingboing is responsible and can be held liable for everything posted on the main site, because they made the decision to publish it.

Because of Section 230, Boingboing is not responsible for anything illegal or slanderous that any of us say here in the BBS, so long as there is a good-faith effort to remove illegal material. They are safe-harbored from libel and slander posted by users.

Boingboing moderates the BBS beyond that because they want advertisers, and because they want it to be a nice place to visit.

I wish I had thought of this obvious example earlier, because it really covers it all.

6 Likes

One of the consequences of Gonzalez winning would be that any platform with a recommendation algorithm gets classified as a publisher with regard to user content. I don’t think that’s going to happen, but if it does, the follow-up will be an immediate attack by right-wingers on moderation policies (whether the platform has an algorithm or not), the argument being that those policies are essentially editorial gatekeeping for user comments and other content.

So if a BBS has a rule that says “No promotion of fascism”, the right will argue that it’s no longer protected by Section 230 (or some successor safe-harbour law) and send out its flying monkeys to lawyer the forum out of existence.

In this scenario, no public-facing social media forum is economically viable. Which is the ultimate goal of the fascists.

5 Likes

Yeah, I meant Section 230 as written and understood for nearly three decades now.

And I actually do agree that algorithmic recommendations are editorial speech, which makes them protected under the 1st Amendment. So I don’t even understand what they are trying to do here, other than tee up the complete removal of Section 230 down the road, something that, for reasons I cannot fathom, has support on both sides of the aisle. That was why I ETA’ed the earlier comment.

If you got rid of Section 230, the 1st Amendment would still let you recommend and promote any legal speech, which in the US includes almost everything, no matter how awful.

Boing Boing’s rules and human moderation choices are also protected speech, natch.

But, as you say, what they want is to open the door to harassment suits against people who won’t let them scream fire in a (privately owned and operated) theatre.

3 Likes

Suspicious Monkey GIF by MOODMAN

Kristen Wiig Yep GIF by Where’d You Go Bernadette

Not so much the former. User-generated content is toxic to advertisers precisely because they can’t be sure what their ad is going to show up against. The BBS is not directly revenue-positive, just community-positive (and IMHO internet-positive, too).

11 Likes

Yes, the BBS is a community and internet positive indeed! I only mentioned advertising because if the BBS were an uncontrolled festering pit, then advertisers would also be inclined to avoid the front page.

I definitely do not include boingboing in “web 2.0 sites that make money off of user generated content”. That was intended to be more about the sites where that is all they do - social media sites and so on.

Boingboing actually creates stuff and provides me with valuable tapir-related information (finally :roll_eyes:).

Edited because I am tired

7 Likes

This is also why Musk’s free-speech-absolutist stance on Twitter makes no sense. The whole business model is based on ad revenue from big brands, and there’s no user content more toxic than that produced by the fascists and racists and misogynists he’s welcoming back.

Take away the Section 230 safe harbour because of their use of the algorithm that the business model is also reliant on, and they’re done (the same goes for the Zuckerberg platforms and, to a lesser extent, for YouTube and TikTok).

Malice on the Republican side as usual, ignorance and Useful Idiocy amongst some on the Dem side.

5 Likes

Nothing so groovy. More like 1970s network TV or maybe 1980s cable.* User-generated content, for whatever purpose, will be gone.

[* Actually, it’s worse, because the gold standard for news won’t be CBS but Faux, and network standards and practices will be stacked with Anthony Comstock types for fear of offending the Xtianists.]

3 Likes

prevent or protect?

1 Like

He’s quoting me. I wrote prevent, but did mean protect. I went to edit it earlier, but Robert had already quoted it, so it’s there forever unless I change it and bribe Robert to change it in the quote :man_shrugging:

5 Likes

… and yet YouTube exists

( and i hear it makes money )

4 Likes

I’ve gotta assume it’s down to reach, and because ads are run against individual videos - as opposed to a thread of posts that could contain anything - so advertisers feel like they are more in control.

Personally, I don’t understand why brands advertise in a lot of places. The internet is terrible.

ETA: Which, I just realized, doesn’t directly address your question. I think that the reach of YouTube is huge enough that brands don’t mind being on the same site as Andrew Tate videos, as long as their ads don’t show up against them. Boingboing, without that reach, would probably have a harder time if the BBS turned into the He-Man Woman Haters Club.

Also, I think that YouTube is still the only place its size on the internet where old-style TV ad people feel comfortable. They just make TV ads and those get put into a video, just like the old days. The other video formats like TikTok are a little different, and probably scary and confusing.

2 Likes

Unless it’s their social media platforms.

3 Likes

On the plus side, the internet would still be there. We would still have our directory of mostly wonderful things. On the minus side, we wouldn’t be able to post our opinions and experiences relating to them. The internet as we currently know it would be gone. :frowning:

2 Likes

I think what @zachstronaut is saying is that the large platforms would be able to afford enough automated moderation, along with hand-picked allowed users (think selected op-ed writers in the paper), to keep working, and that very small sites would be able to manually review every item. The theory being that, with this level of moderation, they could accept the liability for all the content.

For the very small, I would agree. If I host my own website and blog (not using some platform) and then provide a comment section, I can personally review every submitted comment before making it visible. I could accept liability for all of those comments, since I reviewed each one before posting. That would only work, though, because the number of comments is likely in the single digits. It isn’t very community-like either.
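To make that concrete, here’s a toy sketch of that pre-moderation model (everything in it is hypothetical and just for illustration, not any real platform’s setup): comments land in a pending queue and only become visible after the owner personally signs off.

```python
# Toy sketch of pre-moderation: nothing a visitor submits is shown until the
# site owner has personally reviewed and approved it. All names are made up.
from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str

@dataclass
class Blog:
    pending: list = field(default_factory=list)    # held for manual review
    published: list = field(default_factory=list)  # the only comments readers see

    def submit(self, comment: Comment) -> None:
        self.pending.append(comment)                # visitor comments wait in the queue

    def review(self, comment: Comment, approve: bool) -> None:
        self.pending.remove(comment)
        if approve:
            self.published.append(comment)          # the owner takes on liability here

blog = Blog()
c = Comment("visitor", "Great tapir post!")
blog.submit(c)
blog.review(c, approve=True)
print([f"{x.author}: {x.text}" for x in blog.published])
```

That only scales to a handful of comments a day, which is exactly the point.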

For the very large, I disagree that it is possible. There is no way to create automated moderation or manual review at that scale. The automated moderation would either need to throw out so many comments that the remainder could be manually reviewed, or it would have to be so aggressive that whatever got through was guaranteed to be too bland to create liability. Either way, it would destroy the economics of even the largest sites.

If we assume Twitter could be liable for the content of every tweet, they would need to pour money into automated moderation that would simply trash, and never show, the majority of tweets. They could manually review the trickle that passes automation. Alternatively, they could vet and maintain relationships with a select group of people allowed to tweet; those aren’t users anymore, they’re employees. In any of those cases, Twitter would no longer be Twitter, just another web publication that nobody would read anymore. Who would waste time tweeting into the abyss if almost none of those tweets are ever displayed to anyone?

4 Likes

Agreed on your analysis!

I also think that we need to take into account how different the online environment is these days. On the old internet, before Section 230, there just weren’t that many people around, there weren’t really good search engines, etc.

These days, everything everyone posts is much more visible and accessible. If one of the posts that you hand-moderate has something in it that you don’t realize is potentially bad (like a libel or slander kind of thing), the chances of it being found are much higher now, so, as you say, you would only accept the most bland and innocuous comments.

Though you could always set your robots.txt to make sure the site is not indexed, and therefore not get any visitors…
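For the record, that’s just the standard two-liner, assuming crawlers are well-behaved enough to honour it:

```
User-agent: *
Disallow: /
```

It only keeps compliant crawlers from indexing the site; it doesn’t stop anyone from reading or linking to a page they already know about.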

The bill they are trying to pass in Florida provides a terrifying look at how this will shut down discussion of many topics completely. Who is going to host any pro-trans message boards if it opens them up to being sued by Ron fucking DeSantis because posters quite correctly call him transphobic, opening the site operator up to $35K of liability each time?

It’s not a return to the freewheeling internet of the 90s, which, after all, existed under and because of Section 230, which passed in 1996 - just three years after the launch of Mosaic (which most people had to use over a SLIP connection, so not exactly like today’s internet).

6 Likes

This reminds me of when social MUDs like FluffMUCK had specific legal boilerplate. I actually need to talk to Zorin (owner of FluffMUCK) about the reasoning for it; I suspect it was due to pre-230 rulings and vague local Florida laws. I can only see the repeal of 230 making most MUDs close or move to overseas hosts to avoid liability, which would spell the end for MUDs as a medium, in my opinion. I don’t think folks realize how much good comes from 230: it prevents assholes from suing for silly reasons over content moderation.

6 Likes

If it passes and works as they desire, it will shut down all discussion, not just discussion of those topics. There’s no way to be sure that in any random comment section someone isn’t going to call Ron transphobic. The topic of the board doesn’t matter.

Take, for instance, a newspaper or TV station in FL. They’re not going to have a comment section on any story at all. Just having one would almost certainly attract someone calling Ron transphobic in those comments and open them up to liability. It will suppress all speech, which is the point.

5 Likes

Agreed! That is what I really expect. I was toning down my general “doomsday” scenario to focus on one current issue that would, as intended, erase trans people from the internet. Especially because the host and the poster don’t need to be in Florida - just the person being called out for their bigotry.

3 Likes