I can’t help but be reminded of the craziness of the mortgage market during the real estate bubble.
Interestingly, as documented by Gillian Tett, the bank that invented the idea of bundling mortgages and selling them as investments with different risk levels - J.P. Morgan - was unable to understand how the other banks were able to create and sell such huge quantities of low-risk mortgage tranches. The idea that the other banks might be engaged in what was essentially large-scale securities fraud struck them as outlandish.
I think at root here is the problem that the people who originate an idea do it to meet a need and know why they are doing it. Subsequent users, especially in small companies, are likely to cherry-pick and have not experienced the problems that are being solved. They are doing things they do not understand rather than engaging in stupid and destructive behaviour per se. The mortgage market, by contrast, was more like the LIBOR fixing - bankers trying to get the status of a tranche of mortgages uprated to make it more valuable, prepared to sacrifice the long-term benefits of a trustworthy system for short-term profit. As usual I refer people to Taleb’s writing on the subject, especially his explanations of why bankers opt for short-term goals.
Ohhhhh, technological deviance. I thought this was about me.
It’s still about you. We’re glad you showed up. This is an intervention, actually. We’re all your friends here, and we only want the best for you. As such we have to remind you that it’s no longer Halloween. It’s time to change out of your “Sexy Borg” costume.
If you can overlook some of the fawning, Fast Company’s piece on Al Gore and his “Generation Investment [something]” company, with its focus on sustainability combined with long-term economic growth instead of short-term gains, made for an interesting read.
Ummmm… it’s not a costume…
IMO, the Peter Principle being in full effect throughout the tech industry (& corporations in general) helps this deviation problem along; they might even go hand in hand, er, hand in glove, no, hand in shoe - that’s closest anyway.
Ever been at a company and (not) wondered why everyone in management seemed to be an idiot?
Once these failures are in charge, if they sense someone on their level is not like them, they will feel threatened & move to isolate/remove the “threat”. I’ve had a C-level exec confide in me that he had to hide his technical knowledge from his peers (in a tech company) lest they think he is not one of them.
Everything stopped being about “a greater good” (be that a company, a society or whatever) a LONG time ago - it is all about how I get mine and who I can take from. Peter & Normalized Deviance are just facilitators.
I’m suddenly reminded of a series of (propaganda) shorts (like this weenie one) the fed did, selling the American people on the profit motive & how “greed = good” was compatible with their Christian beliefs.
Normalized deviance doesn’t actually explain why companies do stupid, destructive things so much as describe how people might normalize doing those things.
Stupid, destructive things often happen because an entitled principal has a competing interest in drawing out money from an operation.
The “smart” people — often codependent with the principal — may do the deviance normalizing by talking to or disciplining colleagues about “the real world” and related topics.
A term like “normalized deviance” is incomplete without an account of privatized, economic coercion.
Why not just call it rationalized greed?
Also, this book:
Drift into Failure - From Hunting Broken Components to Understanding Complex Systems
Sidney Dekker, Griffith University, Australia
ISBN: 978-1-4094-2221-1
Short ISBN: 9781409422211
In the work I do, if bad code gets through to production without getting caught, you’re looking at a minimum of $100k in losses, and much more likely millions, depending on how long it takes you to catch it. I’m not going to say this never happens, but it’s definitely possible to put systems in place to minimize it, if you have sufficient incentive. Those systems might just involve adding the mistake to the checklist everyone has to go through before promoting their code to production, making the committee that reviews code before it hits production stricter, or simply saying “individual x is no longer allowed to promote code within a week of leaving for vacation without somebody else going over it in detail.”
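For what it’s worth, that checklist gate doesn’t have to rely on anyone’s memory - it can be a script that refuses to promote until every item is acknowledged. A minimal sketch of the idea (the checklist items, the acks.json format, and all file names here are hypothetical, just to show the shape):

```python
#!/usr/bin/env python3
"""Minimal pre-promotion gate: block the promotion unless every
checklist item has been explicitly acknowledged."""
import json
import sys

# Hypothetical checklist; in practice each caught mistake adds an item.
CHECKLIST = [
    "ran full regression suite against a production snapshot",
    "second reviewer signed off on the diff",
    "promoter is not within a week of leaving for vacation",
]

def main(ack_path: str) -> int:
    # acks.json is assumed to map each checklist item to true/false.
    with open(ack_path) as f:
        acks = json.load(f)
    missing = [item for item in CHECKLIST if not acks.get(item)]
    if missing:
        print("Promotion blocked. Unacknowledged items:")
        for item in missing:
            print(f"  - {item}")
        return 1
    print("All checklist items acknowledged; promoting.")
    return 0

if __name__ == "__main__":
    sys.exit(main(sys.argv[1] if len(sys.argv) > 1 else "acks.json"))
```

The point isn’t the script itself; it’s that skipping a step becomes a visible, deliberate act instead of a quiet habit.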
I wonder if to a lesser degree the pressure to keep coming out with new versions and “upgrades” also causes tech companies to do stupid, and sometimes even destructive, but definitely stupid, things.
I can’t tell you how tired I get of the “new=improved” attitude.
So there’s a name for it. I just called it institutional laziness. Starting at my current job, I constantly saw broken stuff: bad software choices, standard tools implemented half-assed, more of a hindrance than they need to be.
Turns out a lot of the problem is that the people who set the stuff up promised to get it all working, didn’t for years, then left, and now nobody has the passwords or spending authority or even the time to get the broken stuff fixed.
What you’re talking about is a thing, for sure, but what the article is talking about is a different thing.
while the normalization of deviant practices in healthcare does not appear substantially different from the way corrupt practices in private business evolve and become normalized, the health professional’s “deviance” is virtually never performed with criminal or malicious intent
Stupid, destructive things happen even with the best of intentions. Really interesting stuff.
Seems like on a long enough timeline this approach will land you at reason #1 for normalizing deviancy: The rules are stupid and inefficient! Do you have any kind of process in place for pruning the checklist from time to time?
Neither adhocracy nor crippling bureaucracy is the solution. Some kind of examined and monitored middle ground is better, if not necessarily the perfect idea.
That happened recently, and periodically someone automates parts of it, but the real point is that, if “stupid and inefficient” costs you a few days of testing, that’s still better than losing millions of dollars and months of work when you eventually discover your mistake.
Good article! There are some omissions on process control that need to be mentioned. As presented, the normalization of deviance assumes conscious agency by one actor or a small group. This may or may not be the root cause. I’m not discounting it; I am contextualizing it.
In health care, many errors are committed at transitions of care: shift changes on the floor, patient transfers from department to department, sending a patient home, or any other handoff - basically anything that can be visualized as a new box in the flow chart. It is at these transitions that many errors occur.
In health care, there is also a systemic failure to iterate and to seek outside perspective, just as with engineering and software in the article. See Deming/Shewhart’s PDSA/PDCA cycle.
The Shuttle disasters often get blamed on one thing, as in this article: “It was normalization of deviance!” Likewise in health care: “Doctors didn’t wash their hands and spread disease!” But the reality on the ground is that there are multiple root causes that can be addressed, including all of the above. The field of statistical process control is rich with stuff to learn.
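To make the SPC point concrete, here’s a minimal sketch of the classic individuals (XmR) control chart calculation that the field builds on - the “handoff error” framing and all the numbers are made up for illustration, not taken from the article:

```python
import statistics

def control_limits(samples):
    """Individuals (XmR) chart limits: estimate sigma from the
    average moving range (MR-bar / d2, with d2 = 1.128 for n=2)."""
    center = statistics.mean(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma_hat = statistics.mean(moving_ranges) / 1.128
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

def out_of_control(samples):
    """Return (index, value) for points outside the 3-sigma limits."""
    lcl, _, ucl = control_limits(samples)
    return [(i, x) for i, x in enumerate(samples) if not lcl <= x <= ucl]

# Hypothetical daily counts of handoff errors on one hospital floor.
errors = [2, 3, 2, 2, 3, 2, 3, 2, 9, 2, 3, 2]
print(control_limits(errors))   # roughly (-2.4, 2.9, 8.2)
print(out_of_control(errors))   # [(8, 9)] - the day-8 spike flags
```

The chart separates ordinary day-to-day variation from a genuine signal worth investigating, which is exactly the discipline that erodes when deviance gets normalized.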
Some people call this a family of origin wound.
There may be a meaningful distinction to be found, but probably not one based solely on whether there was intent to breach a norm or law. Even in cases of overt financial fraud by a single individual, part of the challenge is likely that a community has come to regard the behavior as sufficiently normal.