Mr Stross's argument is that the NSA's obstruction (or the US Government's obstruction via laws that treated encryption like munitions) created an Internet (and therefore all of IT) that is broken by design.
Which is polemic. BS. Crapola.
The problem can be summed up by a variation of the golden triangle - Good + Cheap + Secure (pick any two).
IT Systems are COMPLICATED. Individual computers are COMPLICATED. Networks of computers are HIDEOUSLY COMPLICATED.
If you're old enough and have a narrow enough focus, you might recall when there were competing network technologies and even competing network designs - before TCP/IP swamped everything. From Europe and institutional technology arenas (the PTTs, the government Postal-Telegraph-Telephone organizations) came ISO (and OSI) via CCITT. X.25. IBM's SNA. DEC's DECnet. And more. A lot of technologies were proposed, and some even attempted. X.400 is the OSI e-mail protocol. It never had a chance against UUCP and BITNET - much less SMTP - due to its complexity and dependency on X.500 - which people pretty much gave up on when LDAP arrived. I don't recall whether X.500 was also supposed to be an alternative to DNS.
Early attempts to deploy technologies that tried to address complex security and access control issues usually failed due to their own complexity - and in the meantime, the early Internet, through the RFC process and later through the IETF, kept on creating simple new things that worked, and often fixed (or tried to fix) things that didn't work out as intended. (SMTP is a mess - but the X.400 solution would pretty much abandon anonymity and aliases, and require everyone to have Government IDs to post anything on whatever network might have evolved from that.)
There is ALWAYS pressure to get something out there that mostly works, mostly good enough. Businesses are under tremendous pressure to produce distinctive products - not just put more gloss on the horse carriages they built ages ago.
The NSA didn't encourage buffer overflows. Or SQL injection attacks. Or Cross-Site-Scripting attacks. These are (largely) input validation errors - easy to make by novice programmers (and there are so MANY of them.)
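To make the point concrete, here's a toy sketch of how easily an input validation error becomes a SQL injection hole - a hypothetical `users` lookup against an in-memory sqlite3 database, not drawn from any particular real system:

```python
import sqlite3

def find_user_unsafe(conn, name):
    # BUG: user input is pasted directly into the SQL string.
    # An input like "x' OR '1'='1" rewrites the query's meaning.
    cur = conn.execute(f"SELECT id FROM users WHERE name = '{name}'")
    return cur.fetchall()

def find_user_safe(conn, name):
    # FIX: a bound parameter is always treated as data, never as SQL.
    cur = conn.execute("SELECT id FROM users WHERE name = ?", (name,))
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])

payload = "x' OR '1'='1"
print(find_user_unsafe(conn, payload))  # every row leaks: [(1,), (2,)]
print(find_user_safe(conn, payload))    # no match: []
```

The unsafe version is the one a novice writes first, because it works fine on every input the author happened to test - which is exactly the point.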
There is SO VERY MUCH SOFTWARE OUT THERE. And more, and more, and more, and more.
And people want it cheaper and cheaper. But complexity is a cost of its own (and more and more software just adds to the complexity!) Understanding the software is expensive - whether it's (free) Open Source or proprietary and requires a license fee (or subscription.)
And we also want stability - so that the foundations under us don't shift while we build OUR next big thing... but stability means that bugs don't necessarily get fixed - or are (expensively) fixed multiple times over. (I.e.: RedHat applies bug fixes, Ubuntu applies bug fixes, Apple, IBM, etc... all apply bug fixes to their own variation of a piece of code.)
Yes, networks make computers (and the bugs in their software) more accessible - and more network-based applications mean that there is more software and more bugs. But most of the threat has nothing to do with the NSA spying (not that the spying isn't deeply disturbing.) It has to do with the nature of project and product design - and the fact that people WILL use a product in unanticipated ways, for good and for ill - and the systemic outcome is emergent behavior.
So, my argument might be summed up as: IT is Broken By Design, but that isn't the NSA's fault - they're just a carrion eater on the side of the road waiting for juicy bits to eat.