Hate site KiwiFarms given the boot by web services provider Cloudflare

Was Cloudflare hosting the website, or just providing web security services? I’ve read conflicting reports.

They were providing multiple security services for the site, including privacy services that obscured who was hosting it and who owned it. This is why, for a long time, people who were trying to legally serve the owner of KiwiFarms had trouble finding WHERE to serve it.
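For anyone curious how that hiding works mechanically: when a site sits behind a reverse proxy, public DNS returns the proxy's edge addresses rather than the origin server's real IP. A rough Python sketch of checking that — the two ranges below are samples from Cloudflare's published edge list, and the function name is just illustrative, not any real API:

```python
# Illustrative sketch: a proxied site resolves to the proxy's edge IPs,
# so the origin server's real address (and thus its host) stays hidden.
# These two ranges are samples from Cloudflare's published list; the full
# list is maintained at cloudflare.com/ips.
import ipaddress

SAMPLE_EDGE_RANGES = [
    ipaddress.ip_network("104.16.0.0/13"),
    ipaddress.ip_network("172.64.0.0/13"),
]

def looks_like_proxy_edge(ip: str) -> bool:
    """True if the resolved address falls inside a known edge range,
    meaning DNS is pointing at the proxy, not the origin server."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in SAMPLE_EDGE_RANGES)

print(looks_like_proxy_edge("104.16.132.229"))  # True: an edge address
print(looks_like_proxy_edge("203.0.113.10"))    # False: a direct origin IP
```

So even a subpoena-armed process server doing a DNS lookup would only ever see the proxy's addresses.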

7 Likes

One of the problems with regulations is you need a group of really smart people working together for the common good to put them together. The GDPR, for example, was drafted over a period of years with input from industry, special interests (read: EFF), and many others, including the public. While a company could do that in theory, I don’t think that’s ever happened before. They focus on what’s right for them, not necessarily what’s right for society.

A corporation could create sound regulation, I suppose, but in the overwhelming majority of cases it’s taken governmental (or extra-governmental) SIGs to really draft sound regulations of this scope.

3 Likes

The thing is that these companies do have ToS, which they could just enforce. It’s a simple matter of contract law because it’s right there in the agreement that both parties signed on to.

6 Likes

I don’t get the reluctance to call a nazi a nazi. Fuck em, expose and de-platform them all.

13 Likes

‘Say less,’ as the hip, young folks like to say these days when they are in total agreement.

Call it what it is:

15 Likes

Yessssss.

6 Likes

I covered that above. Those corporate policies are being used today in very-much-not-theoretical attacks on “good people”: bigots and extremists regularly make bad-faith use of some rule a company put in place against a bad actor in order to remove or otherwise silence “good actors.”

I know. It happens to Boing Boing. It happens with Andrea James’ sites that I host. It’s precisely why corporations and corporate contracts cannot be the way we regulate hate on the internet. It has to be bigger, it needs the same sort of thought (over a literal decade) that was put into privacy regulation. Cloudflare’s legal department should not be the ones drafting the rules that the internet will be governed by. For all the reasons I outlined above.

This isn’t a thought exercise. Vulnerable voices and safe spaces are under siege using these very TOS “rules” and actions by these companies that were originally made in the best interests of these same vulnerable populations! Bigots and assholes will always try to weaponize these “rules”, and often they have the money or influence to do it. This is why we need strong, intelligent regulation in place of one-offs from corporations.

5 Likes

I’m not arguing against regulation, but as you said, that takes time.

I think that if legal departments drafted the rules, it would be a step up from what we have now that could save lives in the meantime while we wait for regulation.

The problem is that legal departments do not actually write the rules. The selective and downright capricious enforcement of ToS (as you mention, to placate whoever screams loudest) tells us that those are not really the rules.

4 Likes

Basically, Cloudflare was giving them DDoS protection. Without it, their site likely won’t last long.
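For a feel of what DDoS protection actually does, here's a toy token-bucket rate limiter in Python. Real mitigation involves far more than this (anycast, traffic fingerprinting, browser challenges), and the class here is purely illustrative — but the core idea is the same: let a normal burst through and drop the flood before it reaches the origin.

```python
# Toy sketch of one mechanism behind DDoS mitigation: per-client rate
# limiting with a token bucket. Purely illustrative, not any real product.
import time

class TokenBucket:
    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens refilled per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Spend one token per request; refuse when the bucket is empty."""
        now = time.monotonic()
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # request dropped before it reaches the origin server

bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(20)]
print(results.count(True))  # ~10: the burst passes, the flood is dropped
```

Without something like this sitting in front of the origin, a coordinated flood of requests simply overwhelms the server.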

9 Likes

a privilege exemplified by the ability, in this case, to ignore the horrors directly before them because they are unaffected. such a treat . . . like cyanide dumplings for the spirit.

8 Likes

fundamentally, i agree with you. especially about who should be tapped to help figure this sort of thing out, and also this:

to me - bad analogy time - it’s like gun control.

should we leave it up to the gun manufacturers to create regulation? absolutely not. should we just rely on individual recognizance to stop gun violence? clearly, no. we need gun regulation, and we need it now.

however, we also shouldn’t concede the idea that profit maximization alone is good and proper behavior for a gun company. nor should we concede the culture that glorifies gun violence.

we give companies a free pass if we wait till stakeholders can get it right. especially because we know that the profit maximizing mentality will work to delay and undermine that effort at every turn.

we should clearly articulate our expectations, we should shout and holler, we should boycott ( when we can ), we should steer our money towards those with good governance principles ( esgs, etc. ), we should demand better.

no one thing is perfect. it’s all needed.

10 Likes

He looks like a Grant Morrison character

… or, that is, I guess Foucault is obviously a huge influence on Grant Morrison :thinking:

7 Likes

This lack of consistency and selective implementation of the “rules” is a huge problem in social media. I’ve reported horrendous examples of carefully-coded threats of violence and blatant transphobia to Twitter, only for them to turn around and tell me via automated e-mails that “so-and-so hasn’t broken our safety policies.” Meanwhile, other accounts championing trans rights and other social issues have gotten suspensions and bans over fairly mildly-expressed disagreements with bigots.

Twitch has the same sort of discrepancy in moderation. Haters mass-report leftist channels and have gotten streamers suspended for no truly valid cause. One creator got a timeout for reporting on the “don’t say gay” bills, supposedly for “hate speech”; while I didn’t see that night’s broadcast, I know his content well enough that I cannot believe the accusation was true. Another had a 24-hour ban for a wardrobe issue (which was pretty silly, as everything that needed to be covered was covered… but given the ban took place shortly after she’d gotten into an argument with an under-bridge-dweller, I don’t doubt it was the result of malicious reporting.) I’ve heard streamers say the reports they receive on said “violations” are so vague, they can’t always figure out exactly how they broke TOS, and the review process for appeals can take a month or more. For those who pay their bills by streaming, that’s a huge hit on their finances.

Moderation is useless if it isn’t applied consistently and transparently… especially when it fails to protect the least privileged users and favors their tormentors instead. I can understand how tricky navigating legalities can be for smaller sites, but let’s face it, the bigger players in the game only seem to accept and follow through on their responsibilities when forced to through external pressure and publicity. It’s hard not to wish for some sort of lawful remedy to the situation… but I acknowledge there’s no simple and easy solution.

19 Likes

It really is a crisis of corporate governance. ToS exist on paper, but they are not being used.

These decisions are being made not because of ToS, but despite them. The problem isn’t that they have these rules; it’s that they really don’t have them at all. They make these capricious decisions and cave to pressure, and when they are asked why, they say ToS as though it’s some kind of magic word. Those are no rules at all.

We all want consistent, fact-based application of ToS or no ToS at all (and indeed, what we have now is no ToS at all). Until we can get regulation, perhaps class action lawsuits or the threat thereof will convince companies to beef up their governance when it comes to these things.

If companies have no intention of upholding ToS consistently, then they are entering into these agreements in bad faith. And we need to let them know that we know that.

5 Likes

I was trying to think of an analogy related to environmental stuff. It’s a bit convoluted, but in that arena, corporations succeeded in making us all think pollution was a matter of “personal responsibility.” They were free to make toxic materials, or facilitate the making of them, and smear them all over the planet in the name of capitalism and profit, but the onus was put on us end users to “do the right thing.”
It’s bullshit, and until we have laws and regulations that hold them accountable, there will be little improvement. Good news is, that’s happening.
But in the meantime, we still try to hold corporations accountable from the grassroots level.
I have absolutely no moral qualms about NAZIs being deplatformed by corporations for as long as it takes,* but I also agree the longer lasting and better solution is to get cracking on those laws at the federal level.

*I also don’t have to deal with the stuff @orenwolf is describing about that approach being weaponized by bad actors, though.

11 Likes

While I agree that what Twitter and Twitch have is barely a ToS at all, I don’t agree with the idea of “no ToS at all.” I want ToS that are fairly and consistently applied to everyone as impartially as possible. I do not want an unmoderated free-for-all; if I did, I’d be on 4chan, 8chan, or KiwiFarms, and not here, where the Guidelines protect us as much as possible.

It would be incredibly difficult to implement a decent and effective moderation system on hosting companies or large-scale social media platforms, but I don’t believe it’s impossible, especially considering the money all those businesses rake in annually. Convincing their bosses that it’s necessary is the real challenge. I’m not sure what it will take to make that happen… but the campaign that forced Cloudflare to drop KiwiFarms is a good start in that direction.

11 Likes

Looks like it’s now more than Cloudflare giving them the boot.

*thread

18 Likes

Prosecutions are coming for some KiwiFarms users.

*thread

17 Likes

Every domain name registrar that I’ve dealt with in the last 10 or 20 years has offered to obscure domain registration contact info from the public. It was implemented to protect site owners from spammers who would crawl the WHOIS database and collect the contact information. Law enforcement, though, should have been able to get the information easily. Was Cloudflare not cooperating with them?
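That redaction is visible right in the public record: a privacy-protected WHOIS entry carries proxy placeholders where the owner’s details would otherwise be. A small illustrative check in Python — the sample record, domain, and marker strings here are all invented for the example, not pulled from any real registration:

```python
# Sketch of what registrar privacy looks like in practice: the public
# WHOIS record shows proxy placeholders instead of the owner's contacts.
# This sample record is invented for illustration.
SAMPLE_WHOIS = """\
Domain Name: example.net
Registrant Name: REDACTED FOR PRIVACY
Registrant Organization: Privacy Protect, LLC
Registrant Email: abuse@privacy-proxy.example
"""

# A few placeholder strings commonly seen in privacy-protected records.
PRIVACY_MARKERS = ("REDACTED FOR PRIVACY", "Privacy Protect", "WhoisGuard")

def is_privacy_protected(record: str) -> bool:
    """True if the record shows proxy placeholders rather than real contacts."""
    lowered = record.lower()
    return any(marker.lower() in lowered for marker in PRIVACY_MARKERS)

print(is_privacy_protected(SAMPLE_WHOIS))  # True
```

The proxy service still holds the real contact info, which is exactly why law enforcement (or a court order) should have been able to pierce it even when the public couldn’t.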

1 Like