Hackers report company to SEC after it fails to disclose being hacked by them

Originally published at: Hackers report company to SEC after it fails to disclose being hacked by them | Boing Boing

5 Likes

I think that’s the very definition of adding insult to injury.

10 Likes

I wonder if hackers short the stock prior to public disclosure of a significant breach. Probably a good way to get caught.

3 Likes

This doesn’t seem like an altogether bad thing.

Sure, in an ideal world the ransomware gangs would be taken out of the picture; but so long as we can’t be rid of them, something that makes the c-suite sweat a little about their odds of getting away with failing to disclose to the SEC beats not having that additional check in place.

I just hope that whatever effectiveness the reporting mechanism has isn’t degraded by some combination of two things: people swamping it, since mounting a really lightweight attack (or no attack at all) and then threatening to report it to the SEC is even easier than bothering with a real attack; and bleating from the allegedly-regulated to the effect that they need to be allowed to lie about the state of the company or the (external) criminals will win.

1 Like

Just a quick plug for Matt Levine’s Money Stuff newsletter from Bloomberg. He’s had some very interesting thoughts about this story.

(I’m not a Bloomberg subscriber, and you don’t have to be one to subscribe to the newsletter.)

DAMN, I am conflicted here. Do I want companies to be punished for this behaviour? 100%. Do I want hacker groups, who are often associated with terrible oppressive regimes, to have another tool in their arsenal, one the government is complicit in, no less? Obviously not.

I’m just spitballing here, but it seems these kinds of hacks are basically ubiquitous. It doesn’t seem any major company is going to avoid one at some point, especially if we only hear about a small fraction of actual breaches. Would a regulatory regime work where there is no penalty for breaches IF the company reports within a minimum timeframe AND certain minimum standards of practice, updated regularly, are met? That would encourage early disclosure, and if everyone is disclosing there’s less reputational penalty. The penalty for non-disclosure or for not meeting standards would have to be commensurately steep. Or would that be too difficult to write to cover diverse enough businesses and online presences? It would obviously also need to take into account the sensitivity of the data in question.

I also always worry about such regulations having a tendency to wipe out smaller and newer businesses that don’t have the resources to meet rigorous standards, or for that matter to hire someone to ensure they do (a good example is the corporate consolidation of hospitals and nursing homes in response to activity-based funding models: you need a big accounting department to ensure you are paid adequately).

What you propose as relief is actually more stringent than what we have now. The SEC requirement (once it takes effect, which it will soon) is just that you disclose a material breach within four business days of determining that it is material, and that you provide an annual statement on your cybersecurity practices.
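If it helps to picture how short that window is, here’s a rough back-of-the-envelope sketch of the four-business-day clock. It’s purely illustrative, not the rule text: the function name is mine, and it counts only weekdays, ignoring federal holidays that would also pause the real clock.

```python
# Illustrative sketch only: approximate the "four business days" disclosure window,
# counting weekdays after the day materiality was determined (holidays ignored).
from datetime import date, timedelta

def disclosure_deadline(determination_date: date, business_days: int = 4) -> date:
    """Return the approximate date the disclosure would be due."""
    current = determination_date
    remaining = business_days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # Monday=0 .. Friday=4, so skip weekends
            remaining -= 1
    return current

# Example: materiality determined on a Thursday -> due the following Wednesday
print(disclosure_deadline(date(2023, 11, 16)))  # 2023-11-22
```

Point being: once the company decides a breach is material, the clock is measured in days, not quarters.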

There’s no penalty for the breach itself (at least not here; depending on what it is, you could be in hot water on GDPR, HIPAA, FERPA, etc. grounds), and there’s no minimum requirement for competence. The SEC just demands that you not lie by omission with respect to material breaches, and that you not lie by omission or commission in your description of your risk management practices and expertise.

This topic was automatically closed after 5 days. New replies are no longer allowed.