Well, if a company provides a piece of technology that so much of national infrastructure depends on (both private and public), yet that can be easily compromised en masse from the outside, then yes, nationalizing it should be a valid option open for consideration.
Think about it. Outside of IT, there is no other piece of tech that is prone to these rapid, system-wide problems in quite the same way, except maybe the power grid.
I hear ya. I can imagine that would be quite different. I wouldn’t want to have to do that…nor would I want to restore thousands of machines from backups. I wouldn’t really want to do anything with thousands of machines! Yikes!
Yeah, it's not easy. It's made even more complicated when the organization has multiple standard builds and internal corporate applications that patches must be tested against before rolling out. Now consider that these builds/apps will come in different languages and may have to go through business UAT across lots of time zones. Oh, and each country/region may have different UAT standards.
What that means for the people responsible for testing and pushing out patches is often multiple days with little or no sleep as teams coordinate across regions and do damage control.
I did not say that. You’re putting words in my mouth. I said “governmental requirements.” They should be regulated. There are no strong market forces keeping them safe until after the fact. It’s the same situation as with drugs and medical devices. Now, I know a lot of people say we should return to the pure market-driven model for drugs – let everything onto the market, and if a drug starts killing people right and left . . . well, people will stop using it, right? Well, that’s the situation here. We’ve got billions of people using Windows – and even if they’re tech-savvy enough to have secure systems, they still depend on Windows indirectly, in ATMs, hospitals, and so on. But things are going to pot because instead of continuing to work to make an older operating system safer, the business model is to make new operating systems with more bells and whistles, so as to sell replacements (that really aren’t needed). The only thing that will force these companies to make more secure products is strong government regulation.
One place this is happening is regulation of software in medical devices, at least in the US, by the FDA. Of course, the budget for the FDA has been starved for years, and the number of technical people available to review analyses performed by manufacturers is very limited.
Honestly not my intention at all! Please forgive me if it seemed that way.
I think I understand your position better now. The question of whether this is appropriate/right has been examined for many years. Bruce Schneier was at one time similarly inclined; however, as I recall, the debate on this ended up concluding that while that may be one ideal, it may be neither practicable nor reasonable, for a few reasons, the main one being that “secure” means incredibly different things under different circumstances.
Note also that “secure” is different from “trusted” when discussing formal models and system certifications.
As this matter is strongly off-topic here, please feel free to follow up by PM or start a new thread on it.