It certainly doesn’t help that consumer electronics pushers are happy to gloss over (and sometimes be outright misleading about) glaring security problems; but my impression is that a great many people think about a complex system in terms of solutions: they want a baby monitor, or wifi, or whatever, to do what it is supposed to do. When something they buy doesn’t do that, they are usually pretty good at recognizing it and either bodging at it until it does, or returning it.
Fewer people (and fewer still who are competent enough to be dangerous/useful) look at a complex system and think “Hmm, nice house of cards you got there. I wonder if it will break when I prod at it?”
It is hardly impossible to secure things in the first frame of mind (most routine “unpatched systems are bad practice, and bad practice is broken, so I must fix things until we are in compliance with good practice” IT security checklist stuff is pretty much that); but it is a great deal harder to imagine exactly how unpleasant a system’s failure modes can be if you are thinking in terms of solutions: your default worst-case scenario is either total failure or frustrating intermittent failure to do whatever it is supposed to do, not something that is effectively extreme success, just in the wrong direction.
If you are looking at the situation not in terms of the problem you want solved, but in terms of all the little moving parts grinding together in an attempt to solve it, it becomes far easier to envision assorted malicious repurposing, dangerous assumptions, and the like.
This isn’t exclusive to tech stuff: a variety of unpleasant accidents and occupational health and safety problems end up boiling down to somebody not thinking through the fact that ‘failure’ can get a lot worse than ‘not doing its job’; but the complexity and versatility of IT gear certainly help, as does the fact that so many security issues differ from ‘working’ only in that the wrong person is being allowed to execute programs or given access to data, so almost all the symptoms of success are still present, because the system is, mostly, doing exactly what it was built to do.
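To make that last point concrete, here is a minimal sketch (the endpoint, the camera stand-in, and the baby-monitor framing are my own illustration, not any particular product): a tiny HTTP service that hands the current camera frame to whoever asks for it. Judged in terms of solutions, it works perfectly; judged in terms of the moving parts, the interesting fact is that nothing anywhere distinguishes the owner from a stranger on the same network.

# Hypothetical illustration: a "working" baby-monitor endpoint with no notion
# of who is asking. Everything below behaves exactly as designed; the problem
# is what the design never considered.
from http.server import BaseHTTPRequestHandler, HTTPServer


def read_current_frame() -> bytes:
    # Stand-in for grabbing a frame from the camera hardware.
    return b"\xff\xd8...pretend this is a JPEG from the nursery...\xff\xd9"


class MonitorHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/frame":
            # Note what is missing: no authentication, no check that the
            # request came from the owner's phone rather than a stranger who
            # scanned the network. From the code's point of view, serving an
            # attacker is indistinguishable from serving the parent; every
            # "symptom of success" is present either way.
            frame = read_current_frame()
            self.send_response(200)
            self.send_header("Content-Type", "image/jpeg")
            self.send_header("Content-Length", str(len(frame)))
            self.end_headers()
            self.wfile.write(frame)
        else:
            self.send_error(404)


if __name__ == "__main__":
    # Binding to 0.0.0.0 is the "it just works anywhere on my wifi" choice,
    # and also the choice that makes the house of cards worth prodding at.
    HTTPServer(("0.0.0.0", 8080), MonitorHandler).serve_forever()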