Supreme Court leaves Section 230 alone in case blaming social media in terror death

Originally published at: Supreme Court leaves Section 230 alone in case blaming social media in terror death | Boing Boing

4 Likes

If Google or anyone else is recommending terrorist sites, it’s because those users have watched them before. My only proof of this is that I never see those kinds of sites or anything remotely like them. Still, social media companies, with all their AI supercomputers, should know what’s on their sites and have a way to deal with it. They could do what they did at my job: ban “everything,” and if you wanted to look at something, just ask.
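For what it’s worth, here’s a minimal sketch of that “ban everything, ask for exceptions” model. The domains and the request flow are invented for illustration; real corporate filters are appliance products, not ten lines of Python:

```python
# Minimal sketch of the "ban everything, ask to unblock" model
# described above. The domains and the request flow are invented.
ALLOWLIST = {"docs.internal.example", "payroll.example.com"}
PENDING_REQUESTS: set[str] = set()

def can_visit(domain: str) -> bool:
    """Deny by default; only explicitly approved domains load."""
    return domain in ALLOWLIST

def request_access(domain: str) -> None:
    """An employee asks IT to review and (maybe) approve a domain."""
    PENDING_REQUESTS.add(domain)

print(can_visit("payroll.example.com"))    # True
print(can_visit("random-videos.example"))  # False
request_access("random-videos.example")    # queued for human review
```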

There are 30 billion indexed web pages. Yeah, not happening.

3 Likes

Already indexed and in some database. How long to scan all those pages, a few days? I’m just saying my company kept porn and other questionable sites off its network; I’m sure Google etc. could do the same, if they wanted to.
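For a rough sense of scale, a back-of-envelope calculation; every throughput number below is an assumption for illustration, not a measured figure:

```python
# Back-of-envelope: how long to classify every indexed page?
# Every figure below is an illustrative assumption, not a benchmark.
INDEXED_PAGES = 30e9                # the 30 billion pages cited above
PAGES_PER_SEC_PER_MACHINE = 1_000   # assumed classifier throughput

def scan_days(machines: int) -> float:
    """Days to classify every page with `machines` running in parallel."""
    seconds = INDEXED_PAGES / (PAGES_PER_SEC_PER_MACHINE * machines)
    return seconds / 86_400

for machines in (1, 100, 10_000):
    print(f"{machines:>6} machines: {scan_days(machines):10,.2f} days")

# ~347 days on one machine, ~3.5 days on 100, under an hour on 10,000.
# The raw scan is the easy part; classifying accurately, and keeping up
# with pages that change constantly, is where "a few days" breaks down.
```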

They do. It’s called SafeSearch, and it’s already the default. What more do you want?

I’m sure this decision came down the way it did only because the alternative would have hit the bottom lines of Roberts’ beloved corporate “persons”. Still, I’ll take the win.

This is one of those rulings that I’ve been really on the fence about. On the one hand, Section 230 is good in that it doesn’t hold Corporation X responsible for hosting anyone’s content. On the other hand, if said content is reprehensible, illegal, or otherwise shouldn’t be spread to a wider audience, should CorpX be free to promote it, share it, and make sure it gets seen and noticed? I would argue that their Section 230 privileges end at the point where they’re sharing the content with anyone who hasn’t asked to see it.

In other words, you’re free to host the content, but if you (or your algorithms) are recommending it, then an editorial choice is being made, and you had better be certain that it’s something you want to be promoting and associated with.

The outcome of such enforcement would seem to favor much more mundane content, which would likely lead to less radicalization. That’s quite the opposite of the current model, which is steering society toward less civility and stability.

1 Like

That’s patently false, and not how the algorithms work. They send you down rabbit holes of vaguely “related” videos, which research has shown gradually escalate in severity.

So you’re making a broad pronouncement about a complex technological and social issue intertwined with capitalism based on n=1 and your own anecdote to boot. Cool.

3 Likes

I don’t doubt that the mechanisms actually in use are tuned, uncomfortably carefully, toward ‘engagement’ above all; but “the point that they’re sharing the content with anyone that hasn’t asked to see it” seems like a very fuzzy standard.

If someone shoves something into the search box, say, what exactly have they ‘asked to see’? A single item? Anything from the genre of items that roughly overlaps with what they entered? Are ‘related stuff we’d like you to click on next to remain engaged with the platform’ boxes illicit because they are not explicitly requested; or are they legitimate if sufficiently tightly related to what the user is watching, but illicit if they wander off topic?

Again, none of this is to suggest that the people running constant user-engagement experiments have your well-being at heart; I strongly suspect they do not. It’s just to note that there’s effectively no ‘neutral’ presentation of anything too large to fit above the fold on a typical monitor, so invoking the ideal of one, against which everything else is ‘editorial’, is basically the same as saying that everything is editorial, which seems unhelpful as a standard.

Sometimes they just wear a rut, possibly with deliberate attempts to play the algorithm.

“People who watched this video also watched these videos…”
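That row is classic item-to-item co-occurrence counting. Here’s a toy sketch of the kind of tallying that produces it, with invented watch histories; real systems layer weighting, freshness decay, and policy filters on top of counts like these:

```python
from collections import Counter
from itertools import combinations

# Toy "people who watched this also watched..." counter.
# The watch histories below are invented for illustration.
histories = [
    ["cooking_101", "knife_skills", "sourdough"],
    ["cooking_101", "sourdough", "fermentation"],
    ["knife_skills", "cooking_101", "fermentation"],
]

# Count how often each pair of videos shows up in the same history.
co_watch: Counter = Counter()
for history in histories:
    for a, b in combinations(sorted(set(history)), 2):
        co_watch[(a, b)] += 1
        co_watch[(b, a)] += 1

def also_watched(video: str, top_n: int = 3) -> list[str]:
    """Videos most often co-watched with `video`, by raw count."""
    scores = {b: n for (a, b), n in co_watch.items() if a == video}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(also_watched("cooking_101"))  # all three co-watched videos tie here

# Note there is no editorial intent anywhere in this code, yet heavy
# co-watch loops keep surfacing the same cluster: the "rut" above.
```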

2 Likes

If someone has followed a link to the content, or has explicitly chosen to follow its author, then they are saying “I want to see this content,” and that request should be honored.

I’ll agree it does get fuzzy when you start including things like searches for the content, and especially searches for similar content. A search for an exact match of, say, the title of the content is easy to argue as a relevant, non-editorial result, and a typo-corrected version of that search is similarly easy to defend. Something along the lines of “people who searched for X also searched for Y” is, in my view, also not an editorial decision, because the users are generating that association, not the business. But when you get to “related stuff we’d like you to click on next to remain engaged with the platform,” that’s an explicitly editorial choice made for business reasons, not for practical, giving-the-user-what-they’re-asking-for reasons. That’s where I’m arguing the line should be drawn.
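To make that proposed line concrete, here’s the taxonomy as a sketch; the categories and the “editorial” cutoff just restate the argument above, not any real platform’s policy:

```python
from enum import Enum

# Hypothetical taxonomy restating the line argued for above;
# this is not any platform's actual policy.
class Provenance(Enum):
    EXPLICIT_REQUEST = "followed a link / follows the author"
    EXACT_MATCH = "search matched the title exactly"
    TYPO_CORRECTED = "search matched after spell correction"
    USER_ASSOCIATION = "people who searched X also searched Y"
    ENGAGEMENT_DRIVEN = "surfaced to keep the user on the platform"

# Under the proposed standard, only the last category counts as an
# editorial choice the platform should answer for.
EDITORIAL = {Provenance.ENGAGEMENT_DRIVEN}

def platform_answers_for(p: Provenance) -> bool:
    """True where the proposed standard treats the result as editorial."""
    return p in EDITORIAL

for p in Provenance:
    print(f"{p.name:<20} editorial={platform_answers_for(p)}")
```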

To your point about user engagement exceeding all other measures of success: it shouldn’t, and platforms should be taking into account other metrics, like the well-being of their users, or of society in general. Since the market isn’t generating those incentives on its own, there are other ways to encourage them, like laws, regulation, and/or the courts.

This topic was automatically closed after 5 days. New replies are no longer allowed.