Big Tech is deleting evidence needed to prosecute war crimes, and governments want them to do more of it

Originally published at: https://boingboing.net/2019/05/09/block-it-all.html

1 Like

I don’t mean to be picky here, but I think the point is not that they are deleting the raw evidence so much as that they are censoring its presentation, or using faulty tools to block such postings on their platforms.

This is, in turn, evidence that we rely too heavily on a mere handful of websites and content disseminators, instead of spreading material across as many hosts as possible. It also shows how hard it is to distinguish a terrorist group’s bragging propaganda from a small indie reporter’s exposé of war crimes.

Sadly, these digital records may need to be archived offline, in secure locations, for future reference, specifically for war-crimes trials.

Hell, if the losers playing the ‘mole’ in the game of Whack-a-mole between the New Zealand government, which was trying to get the mosque shooter’s video offline, and the murderous bigots supporting him could keep copies of that horror for their own use, why can’t the same tactic be used in support of justice?

3 Likes

I guess it depends on how accurate the algorithm is. I doubt that Facebook et al. are going to spend cash on employing actual, discerning humans to check whether something is pro- or anti-terror.
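
Even a pretty accurate classifier makes an enormous number of wrong calls at platform scale. A back-of-envelope illustration, with numbers invented purely for the sake of argument:

```python
# Back-of-envelope: even a very accurate classifier misfires at scale.
# Both numbers below are made up purely for illustration.
daily_uploads = 500_000_000  # assumed upload volume
error_rate = 0.01            # i.e. a generous 99% accuracy
misclassified = daily_uploads * error_rate
print(f"{misclassified:,.0f} wrong calls per day")  # 5,000,000
```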

2 Likes

I suppose the algorithmic solution would be to flag a post as requiring approval, much as the BoingBoing BBS occasionally does. But a slushpile of that size would be almost impossible to manage, really. And the poor schmucks who review flagged content are already traumatised by the graphic violence they are forced to look at, so their patience is not going to be high.
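
The triage logic itself is trivial; the slushpile is the problem. A minimal sketch in Python, assuming a hypothetical upstream classifier score (none of these names are any platform’s real API):

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class Upload:
    uploader: str
    content_id: str
    violence_score: float  # 0.0-1.0 from some upstream classifier (assumed)

REVIEW_THRESHOLD = 0.4  # deliberately low: err toward human review
review_queue: "Queue[Upload]" = Queue()  # the "slushpile"

def triage(upload: Upload) -> str:
    """Publish low-risk uploads; hold everything else for a human."""
    if upload.violence_score < REVIEW_THRESHOLD:
        return "published"
    review_queue.put(upload)
    return "held_for_review"

# e.g. triage(Upload("user42", "vid-001", 0.9)) -> "held_for_review"
```

The only real knob is the threshold, and every notch lower means more material in front of those traumatised reviewers.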

In the end, it’s much like the old question of handing out fliers or using the cork board at a mall or similar public-place-in-private-hands: how much free speech must the owners tolerate?

3 Likes

I think that’s the fuzzy part of the article: it conflates “taking offline” with “deleting”. The evidence is most likely still there; the original uploader most likely still has a copy, and when a copy is removed from a website it lives on for a while in backups and history, as well as in browser caches, until it fades.

3 Likes

That’s a good point. There must be a limited supply of people who are inured to images of real violence but who are also not supportive of them.

2 Likes

Anything removed for copyright or content violations should be de-linked but retained for review by law enforcement and legitimate researchers. This isn’t rocket appliances.
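
In storage terms, “de-linked but retained” is just a soft delete: the takedown flips a visibility flag instead of destroying the record. A rough sketch, with hypothetical table and column names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE content (
        id INTEGER PRIMARY KEY,
        blob_path TEXT NOT NULL,          -- the raw file stays on disk
        publicly_visible INTEGER DEFAULT 1,
        retained_for_review INTEGER DEFAULT 0
    )
""")

def take_down(content_id: int) -> None:
    """De-link from public view, but keep the record for investigators."""
    conn.execute(
        "UPDATE content SET publicly_visible = 0, retained_for_review = 1"
        " WHERE id = ?",
        (content_id,),
    )
    conn.commit()
```

Who gets access to the retained rows is the real design question, but that’s policy, not rocket appliances.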

1 Like

There may be people “out there” who recognise the perpetrators, victims and locations depicted. If the material is restricted to a few official investigators, they will not get this information.

1 Like

Stories like this make me wonder: Why do we think people have the right to upload anything, anywhere, for immediate broadcast? YouTube should have to review what it puts up, and if that’s impossible for humans, then maybe don’t let people upload as much crap video as they like.

Again: The expectation that there be no moderation whatsoever when you upload to a publisher/platform is idiotic, unsupported and bad for the culture.

This topic was automatically closed after 5 days. New replies are no longer allowed.