Originally published at: https://boingboing.net/2019/06/26/less-than-1-week-after-florida.html
…
Gotta say, given the name, the League of Cities is pretty disappointing in action.
This is dangerously close to becoming a pattern!
A decent backup policy with tested restores would have cost them less than it will cost just to decrypt the files using the key the ransomers hand over. Their entire IT staff should be fired.
What is this “IT staff” of which you speak? – Florida town managers and mayors, apparently
@xeni - Riviera Beach, not Riviera City.
I’ll insert the same Kipling quote as I did less than a week ago, because the truth cannot be too oft repeated:
And that is called paying the Dane-geld;
But we’ve proved it again and again,
That if once you have paid him the Dane-geld
You never get rid of the Dane.
This seems like a spectacular opportunity for managed-solutions/cloud providers to swoop in and pick up a bunch of smaller cities as clients. Too bad I’m doing 12 other things right now.
The last thread compared this to a Danegeld. That is a lovely way to talk about extortion. But the metaphor holds up.
They need to kick that Bison’s ass so hard!
It would be good for society if I hacked a couple of cities, made ransom demands, then did not give them a decryption key after they paid, right? Like, that would take away the shaky trust these cities have that the hackers will actually hand over the keys once paid, so cities would stop paying, so the hackers would have no incentive to do the hacking in the first place. Plus, I’d get a cool half million dollars to blow.
Somebody talk me out of this.
i still lament the nsa: assembling the best hacks to weaponize them instead of finding ways to protect against them.
but what are constant corporate data breaches, over-collection and over-retention of personal data, and the hijacking of hospitals and cities good for, if not showing off how governments are failing the cyber?
You are assuming that the IT staff got the budget for the necessary upgrades - this is almost never true.
You’re making a huge assumption that the IT department has the resources to do this. On my last big corporate IT job the approved tools were so bad I went and wrote my own. Management didn’t care that what they provided was shit. Even when there was a huge data loss as a result of ineffective monitoring and backup, nothing changed. Everyone on the server team wanted to do better, and could have, but we weren’t allowed to.
Odds are good that “the entire IT staff” in this case are one or two overworked city employees who never got the training or resources to do the job right.
Just in case we were wondering how WW3 will be fought.
IT Staff: You know we really shouldn’t do it this way and we need more budget for better backup and recovery
Management: Do it this way.
Odds are also good that there are IT staff who (while potentially aware of the ominous creaking in a general way/through bitter water-cooler rants) had neither authority nor responsibility in the backup area.
Basically any organization with computers (unless it’s small enough that all of these roles are one person) is going to have some PC re-imagers and general fixers in approximate proportion to its user base; some application administrators for whatever functions the department provides beyond basic web 'n email (plus admins and web-content people for the web and email themselves, if those aren’t farmed out to a ‘cloud’ vendor); for a municipality there’s likely a GIS-focused type and probably some sort of ERP-wrangler; and potentially a network position or two that handles the usual LAN stuff plus some of the slightly less typical RF for police and fire…
If a majority, or even a significant minority, of your IT department deserves blame in the event of a catastrophic backup failure, that’s a very bad sign: it means your backup operations are apparently hideously bloated compared to the systems they serve. (Edit: department management, though typically not personally involved in backups, has a responsibility to ensure that the department they are managing is functional, so they would also count as ‘deserves blame’ without counting against the size of your backup squad.)
Good backups are hardly trivial, and they are ‘interdisciplinary’ in the sense that almost all areas of the IT department need to help convert their knowledge of their systems into cogent and realistic backup resource/policy requests; but they are also amenable to automation (indeed, if you find yourself doing lots of ad-hoc manual backing up, that’s not a good sign), and so really shouldn’t be all that large a part of the department compared to the stuff they back up. Potentially a rather capital-intensive part, depending on the appetite for SANs and such; but it shouldn’t be where the serious employee numbers are.
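To illustrate what ‘amenable to automation’ looks like in practice, here’s a minimal sketch of a nightly job that takes a backup and then actually test-restores it before declaring success. All of the paths, the retention count, and the use of a plain tar.gz archive are hypothetical stand-ins; a real municipal shop would be dumping databases and shipping copies off-site, but the shape is the same.

```python
#!/usr/bin/env python3
"""Nightly backup with a test restore: archive a directory, then prove
the archive can actually be read back before calling it a success."""

import hashlib
import tarfile
import tempfile
from datetime import datetime
from pathlib import Path

SOURCE = Path("/var/lib/city-records")      # hypothetical data directory
BACKUP_DIR = Path("/mnt/backups/records")   # hypothetical backup target
KEEP = 30                                   # retention: keep the last 30 archives


def sha256(path: Path) -> str:
    """Checksum a file in chunks so large records don't blow out memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def make_backup() -> Path:
    """Write a timestamped tar.gz of SOURCE into BACKUP_DIR."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive = BACKUP_DIR / f"records-{stamp}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SOURCE, arcname=SOURCE.name)
    return archive


def verify_restore(archive: Path) -> None:
    """Extract the archive to scratch space and compare checksums against
    the source files; raise if anything is missing or differs."""
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive, "r:gz") as tar:
            tar.extractall(scratch)
        restored_root = Path(scratch) / SOURCE.name
        for original in SOURCE.rglob("*"):
            if not original.is_file():
                continue
            restored = restored_root / original.relative_to(SOURCE)
            if not restored.is_file() or sha256(restored) != sha256(original):
                raise RuntimeError(f"restore check failed for {original}")


def prune_old() -> None:
    """Keep only the newest KEEP archives."""
    archives = sorted(BACKUP_DIR.glob("records-*.tar.gz"))
    for old in archives[:-KEEP]:
        old.unlink()


if __name__ == "__main__":
    archive = make_backup()
    verify_restore(archive)
    prune_old()
    print(f"backup OK: {archive}")
```

In real life you’d restore into scratch infrastructure and verify against the archive’s own manifest rather than the live files (which keep changing underneath you); the point is simply that a restore gets exercised on every run, not for the first time on the day the ransom note shows up.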
Yup… good documentation is always needed for such a noble endeavor!
Too bad I’m doing like 12 other things right now…
- Send your Bitcoin wallet address to the cities that have already been hacked and demand they send you the ransom.
- Let them weigh the possibility that you’re really the hacker. If they do…
- Profit!