Gun violence researchers at UC Davis are racing to save the ATF's gun violence data before Trump blows it away


Puzzles me. Given the NRA’s beliefs about guns, they should expect to find vindication in this research.


Deletion of government research and data should be grounds for removal from office, in any functioning democracy. It’s basically government-mandated book burning.


Perhaps more researchers should take the WikiLeaks insurance-file approach - publicly release a securely encrypted archive containing the data, so anyone with an interest can take a copy, but use an online dead-man's switch to release the decryption key only if the switch stops being maintained.

This should work until some enterprising cryptoboffin figures out a good way to tie cryptography to a warrant canary page, so that when the canary dies, the archive is unlocked…
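The dead-man's switch idea above can be sketched very simply. This is a minimal, hypothetical illustration (the file names, the seven-day deadline, and the `publish` stub are all assumptions, not anyone's actual system): the maintainer periodically touches a heartbeat file, and a scheduled job releases the key only if the heartbeat goes stale.

```python
# Minimal dead-man's-switch sketch (hypothetical paths and threshold).
# A cron job runs check(); the maintainer must refresh the heartbeat
# (e.g. `touch heartbeat`) before the deadline or the key is released.
import os
import time

HEARTBEAT = "heartbeat"      # file the maintainer touches periodically
KEY_FILE = "archive.key"     # decryption key, kept private until release
DEADLINE = 7 * 24 * 3600     # seven days without a touch triggers release

def is_stale(now=None):
    """True if the heartbeat file is missing or older than DEADLINE."""
    if not os.path.exists(HEARTBEAT):
        return True
    age = (now or time.time()) - os.path.getmtime(HEARTBEAT)
    return age > DEADLINE

def check():
    """Run periodically; releases the key once the switch lapses."""
    if is_stale():
        with open(KEY_FILE) as f:
            publish(f.read())

def publish(key):
    # Placeholder for the real release channel (mirror, mailing list, etc.)
    print("RELEASING KEY:", key)
```

The hard part, of course, isn't this logic - it's making the switch itself tamper-resistant and the release channel uncensorable, which is where the warrant-canary idea comes in.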


I’m heartened to see that academics and scientists have learned from the past and are taking appropriate action so that the vandals won’t win. This is the model of resistance: everyone agreeing on who the opposition is and then doing their own small part to defend the areas of the front they know best.


I think that’s even over-complicating things. There’s no real reason the data needs to be encrypted, and using encryption is a huge barrier in research. People just won’t do it. No matter how much I break it down for my colleagues, they just don’t encrypt stuff unless forced. Just dump the data on FigShare, and on an institutional repository website, if one is available.

I maintain a git repo with data I think are likely to vanish. My own data are backed up to several places, including overseas servers. It’s a good thing I’m one of these open-access computer nerds, so that use of revision management to handle my data was always part of the plan. I never thought my workflow in and of itself might be resistance.
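A workflow like that can be as simple as pushing every commit to several independent mirrors. Here's a self-contained sketch - local bare repositories stand in for what would, in practice, be remotes on separate servers in separate jurisdictions (the names and URLs shown are illustrative):

```shell
# Sketch of a multi-mirror backup workflow for a data repo.
set -e
workdir=$(mktemp -d)

# The working data repository.
git init -q "$workdir/data"
cd "$workdir/data"
git -c user.name=me -c user.email=me@example.org \
    commit -q --allow-empty -m "add dataset snapshot"

# In real use these would live on independent hosts, e.g.
#   git remote add overseas git@mirror.example.eu:archive/data.git
git init -q --bare "$workdir/mirror-a"
git init -q --bare "$workdir/mirror-b"
git remote add mirror-a "$workdir/mirror-a"
git remote add mirror-b "$workdir/mirror-b"

# Push every branch and tag to each mirror after committing new data.
for remote in mirror-a mirror-b; do
    git push -q --mirror "$remote"
done
```

Because git commits are content-addressed by hash, every mirror also doubles as tamper-evidence: a modified copy won't match the commit IDs everyone else has.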


Hypocrisy is a huge part of the American right-wing mentality, and that almost certainly applies to the NRA, too. Especially since they, AIUI, are more of a propaganda/lobbying group for the gun industry these days, rather than one that cares about gun owners.


Why encrypt? This is public data, paid for by taxpayers.


Hmm. Activists mass-downloading data to make it publicly available, free from control. That reminds me of something…


No, that isn't really the case. The NRA is VERY good at getting private donations, and corporations are banned by law from contributing to its political lobbying arm. They can contribute to the part that does legal defense, training, education, the museum, etc. But even then, the private donations from millions of people eclipse corporate donations.

Certainly there is a sort of symbiotic agreement, more gun owners means more potential NRA supporters, which encourages more gun owners, who buy more guns, etc. But claiming they only do what they do because companies want to sell more guns doesn’t make a lot of sense. Other than a total ban (which has repeatedly been said to not be anyone’s actual goal), added restrictions alone won’t really hurt gun manufacturers. Something like a magazine restriction means they just sell guns with different magazines. Gun makers in the US and abroad still make a living selling in much more restrictive countries, as well as the various states with restrictive laws.

I think this attitude comes more from assumptions and negative stereotypes of corporations in general - a sort of reflexive belief that evil corporations, not the millions of people who disagree, are the reason for opposition to gun laws.

Some data can be released as raw, unencrypted data, sure. Although it’s extremely important that data verification is included, so that if anyone tries to extract information from a copy of the data, they can be sure that it’s an unmodified copy. A hash or checksum of some sort should be sufficient.
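That verification step is straightforward to implustrate - sorry, illustrate - with a few lines of Python. This is a generic sketch, not any project's actual tooling; the point is that the researchers publish the digest and anyone with a copy can check it:

```python
# Sketch: publish a SHA-256 digest alongside the data so any copy can
# be verified as an unmodified one. (Sample data here is illustrative.)
import hashlib

def sha256_digest(data: bytes) -> str:
    """Hex digest to publish alongside the dataset."""
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, published_digest: str) -> bool:
    """True if this copy matches the digest the researchers published."""
    return sha256_digest(data) == published_digest
```

One caveat: the digest only helps if it's obtained from a trusted channel. A hash stored next to the file detects accidental corruption, but an attacker who can modify the data can modify a co-located hash too.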

The conclusions of research based on publicly funded data should, of course, be public. However, not all gathered data can be made public without sanitisation or context, even if it's for publicly funded research. For example, it often needs to be anonymised, or to have sensitive identifying characteristics or demographics separated or removed. Furthermore, releasing raw data without context is very dangerous, as it allows people to selectively interpret the data, or worse still to misuse it for purposes other than those for which it was collected. Selective interpretation might not sound so bad, but it can lead to serious ethical problems and scientific misrepresentation. It's far too easy to end up drawing false conclusions, or misrepresenting something, even with the best of intentions (and not everyone has the best of intentions). This is why research is difficult - it's not as simple as totalling up the answers to a questionnaire; it takes a huge amount of contextual work to turn raw information into useful knowledge.

For these reasons, I’d say that encryption of source data should probably be standardised - as, in fact, it is in most comparable cases (despite what laptops left on trains and in taxis might suggest).
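Encryption at rest doesn't have to be exotic, either. Here's a minimal sketch using the standard `openssl enc` tool - the file contents and the passphrase-on-the-command-line usage are purely illustrative (a real deployment would use proper key management, not a literal passphrase in a script):

```shell
# Sketch: encrypt a raw data file at rest before it leaves the research
# machine. All file names and the passphrase are illustrative.
set -e
dir=$(mktemp -d)
printf 'subject_id,response\n17,4\n' > "$dir/responses.csv"

# Encrypt with AES-256-CBC, deriving the key via PBKDF2.
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -in "$dir/responses.csv" -out "$dir/responses.csv.enc" \
    -pass pass:correct-horse

# Decryption with the same passphrase recovers the original file.
openssl enc -d -aes-256-cbc -pbkdf2 \
    -in "$dir/responses.csv.enc" -out "$dir/recovered.csv" \
    -pass pass:correct-horse
```

Which is roughly the level of protection those lost laptops should have had in the first place.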

Disclaimer: I work in a research department of a university, and sometimes handle sensitive research data, though I’m not a research analyst myself. My work experience has informed the opinions stated here, but they are my own opinions, not necessarily those of my management or my employer.


No puzzle at all. The lead researcher's past work on these issues has quite a bit of slant to it.

With those names, they should also be the protagonists of a young adult science fiction series.


Isn’t the latter a character in Neuromancer?

The AI trying to free itself.


Wintermute with an r was an AI in the book, and the name of probably thousands of servers in real life since.


This topic was automatically closed after 5 days. New replies are no longer allowed.