As the Supreme Court eyes Section 230, here's what's at stake.

Originally published at: As the Supreme Court eyes Section 230, here's what's at stake. | Boing Boing

9 Likes

The Moderated Content podcast has good coverage of all this, at least to me, someone who is not an expert in US law but is familiar with some of the law around it.
https://moderated-content.simplecast.com/episodes

12 Likes

So far the court seems unconvinced by the arguments in Gonzalez for chipping away at Section 230; they’re so weak that I’ve seen some experts asking how the case came to be heard by SCOTUS at all.

The outcome of Taamneh might be some very narrow and specific exceptions allowing people to sue platforms over user content that promotes terrorism. The problem is that bad actors will use that as a pretext to launch expensive nuisance lawsuits against platforms they don’t like. That could shut down a smaller BBS like this one (even with the EFF in its corner, BB still has to pay lawyers every time some disingenuous arsehole serves it with papers).

Right-wing politicians are also looking to weaken Section 230 insofar as it allows platforms to have moderation policies at all; they don’t like it when fascists and bigots and misogynists face consequences for posting their garbage.

18 Likes

Justice Brett Kavanaugh summarized Waxman’s argument this way: “When there’s a legitimate business that provides services on a widely available basis … it’s not going to be liable under this statute even if it knows bad people use its services for bad things.” (AP News)

Why do I have a feeling they are thinking about gun law here?

18 Likes

SCOTUS has proven itself to be ideological above all else now, so we must assume ideological motivations for everything, including which cases they choose to hear.

As you surmised, I would guess this is about giving people an “in” for challenging removal of far right content from social media.

21 Likes

What I would like to see, however this shakes out, is more platforms explicitly defining far-right user content as terrorist content in their moderation policies. A person advocating for fascism or white supremacy or theocracy or male cisgender supremacy is always effectively arguing in favour of political violence.

23 Likes

Rest of the world:

okay-bye

8 Likes

Something about the lack of common-carrier ideals makes Section 230 problematic. It seems bad that you can hide behind 230 while actively ignoring your own self-imposed community policies and boosting harmful content.

That said, this seems like a social problem, not a legal problem. And I agree that removing 230 could make medium-sized online communities completely a thing of the past. You’d only have the giants and those small enough to avoid legal scrutiny.

3 Likes

Well, you know exactly why this is in the Supreme Court. The interesting thing is that many of the arguments in the earlier courts on this issue (I have been following it a bit) relied on claims that entirely contradicted the stated purpose of the legislation: “this clause is intended to do this…” It wasn’t, and it didn’t, so the crazy nazi factor is strong in this suit.

9 Likes

That’s a really good way to put it. There’s no way to argue against the existence of another person without that line of argument ultimately leading to murdering said person.

Deep down, I think the people making these arguments know that, too. It’s why they always try to lean on “mental illness” as an explanation for their position. If you believe gay people shouldn’t exist, then you need an “out” other than murdering all of them. Claiming it’s an illness is really the only one they’ve found so far. “Moral weakness” used to be the go-to, but when you’ve got world-renowned philanthropists and people like Alan Turing being outed, that argument falls apart. Nobody believes they are bad people, so mental illness is all they have left.

That argument is getting weaker as well, as generation after generation of the psychological and psychiatric professions debunk and dismiss that idea. That might be why actual violence is increasing once again. It’s all they have left.

9 Likes

On the downside, this could end the internet as we currently know it.

On the upside, this could end the internet as we currently know it.

20 Likes

Perfect! I was coming here to post “On the upside, no more social media, on the downside no more Internet at all!”

I assume that they must know this, because it’s obvious. Removing Section 230 doesn’t remove moderation from a legal standpoint - that is already covered by the 1st Amendment, since editorial choices are obviously speech. All 230 does is protect hosts from being liable for illegal, slanderous, or libelous material their users post, with a promise to remove it.

Platforms moderate material that doesn’t meet the standard above because they want to have users and advertisers and shape the kind of community they want.

Removing 230 doesn’t end moderation; it just makes it impossible for any company, no matter how large, to stay on the internet with user-generated content of any sort. The company can be sued directly by anybody for the content their users post (and right now you can sue a user for a slanderous or libelous post, or prosecute someone for illegal actions online).

So, I have to assume that this is an attempt to end the internet as we have known it completely.

ETA: I also just want to make it clear that I am not just talking about today’s Supreme Court arguments. There has long been a movement on both sides of the aisle to eliminate 230 completely; Biden has expressed support for doing so in the past.

10 Likes

While I know our legal system is far from perfect, it seems like an anti-SLAPP feature could be built right into that part of the law. If you fail to prove your point, you pay the defendant’s legal fees. If your suit is shown to be frivolous, you owe the defendant 10x their legal fees.

7 Likes

The giants have more exposure, both criminal and civil: the more users you have, the more of them are posting stuff you could be sued or arrested for. The small ones would be at risk from any user posting a child abuse image, which, I promise you, will not go unnoticed or unprosecuted.

NO ONE will take on that risk, big or small. This BBS would be gone, I promise you.

What this really is, I am assuming, is a way to try to get back to a publisher/broadcaster model where corporate interests publish things and everyone else consumes them. It’s a way to shut everyone else up and regain control of the messaging.

ETA: And no one is hiding behind 230 to allow hate speech. 230 is concerned with illegal speech. Because we are a broken country, hate speech is legal and protected by the 1st Amendment, and it would be unaffected by the removal of 230. The reason that sites exclusively promoting hate speech, like Stormfront, end up on the Tor dark web is not that they were driven there by the government; it’s that no ISP will do business with them.

7 Likes

Speaking of SLAPP-like features…

3 Likes

The only way that will happen is if the big tech platforms think it’s necessary. They’ll be opposed all the way by right-wingers claiming it’s censorship.

2 Likes

An unmoderated BBS maybe would be gone. The giants and the small both have the benefit of being able to handle moderation. The vast middle, where you cannot afford to moderate the volume of content, would be most affected. Just as with print media, anybody would be able to publish anything they want, but they would take on the legal responsibility. So you end up with a few consolidated giant outlets that publish safe crap, and then you have the tiny papers and zines almost nobody reads. But they do exist. And they are subversive and lovely.

So in your scenario maybe the Boing Boing BBS dies, but hey, you could have whoever you want over at edgore bbs. It’d be like the 90s internet.

I’m being a bit flippant with all this. I guess I just can’t really envision the total doom scenario you kind of paint. The genie is out of the bottle when it comes to online communication. And besides, hopefully my donations to EFF and ACLU lawyers can prevent the worst outcome here. I wish them luck.

2 Likes

Incorrect. 230 is what allows a company to moderate without incurring liability.

Without 230, a host is directly liable for what their users post; they are treated as a publisher, without a safe-harbor carve-out.

Real-world example: in the 90s, a friend of mine made a post on the Yahoo Finance message boards about a company, saying that it was not a good investment. We both worked for a well-known financial company at the time, and he was very well known in finance and technology.

The company sued him, the company we worked for, and Yahoo! for libel. The court threw out the suit against Yahoo, citing the protections provided by 230. The suit against my friend went ahead, while the suit against the company we worked for was dismissed because he was not posting from work.

What you are saying is NOT how this works.

I would suggest that is because you appear to completely misunderstand what Section 230 actually does. The scenario you proposed above about publishers has nothing to do with Section 230; I would recommend reading up on it. The EFF explains all of this very clearly in many, many things they have written.

11 Likes

That’s what a lot of people said about the prospect of SCOTUS overturning Roe v. Wade.

14 Likes

[GIF: “Up Here” by Chord Overstreet]

6 Likes