Gonna plug Richard Levins' essay "Preparing for Uncertainty" in the journal Ecosystem Health (volume 1, issue 1) as a very relevant and more in-depth take on these kinds of strategies.
Do you happen to have a copy you could share? The journal's web page only goes back to Volume 3, 1997.
Both Accuracy and Resilience assume the user has resources to put towards dealing with the outcome. Sometimes what looks like Denial is just a person of no means.
Well… I'll meet you part way and say (1) the full citation is Levins, R. (1995). Preparing for uncertainty. Ecosystem Health, 1(1):47–57, and (2) inter-library loan is your friend (at either academic or public libraries).
Best wishes, and happy new year.
Perhaps it's working in insurance that's done it to me, but whenever I read these sorts of articles I find it completely bizarre that this isn't how everyone thinks! We call it "risk management", and personally I think the more people apply those principles to everyday life, the better off they'll be. The process is applicable to almost any uncertain event, including ones in the future. (Incidentally, my favourite example of this sort of thing not being used by most people is their bizarre and irrational fear of their child being abducted.)
In a nutshell, you a) identify the bad thing that might happen and what harm it might cause; b) identify the methods by which it can be prevented or its harm reduced; and c) implement those things that you consider to be cost-effective. People do parts of this process all the time, but they rarely string it all together in a way that makes any sense.
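The a/b/c steps above boil down to a back-of-the-envelope expected-value calculation: a mitigation is worth doing when it reduces your expected loss by more than it costs. Here's a toy sketch in Python (every scenario, name, and number is invented purely for illustration):

```python
def expected_loss(probability, harm):
    """Expected cost of a bad outcome: chance it happens times the damage if it does."""
    return probability * harm

# (a) Identify the bad thing: say a burst pipe, 2% annual chance, $10,000 in damage.
baseline = expected_loss(0.02, 10_000)  # $200/year of expected loss

# (b) Identify mitigations, what they cost, and what they achieve.
mitigations = [
    # (name, annual cost, probability afterwards, harm afterwards)
    ("water sensor + auto shutoff", 50, 0.02, 1_000),    # limits the damage
    ("repipe the whole house", 800, 0.001, 10_000),      # mostly prevents it
]

# (c) Implement only what's cost-effective: the drop in expected loss must beat the cost.
for name, cost, p_after, harm_after in mitigations:
    savings = baseline - expected_loss(p_after, harm_after)
    verdict = "worth it" if savings > cost else "skip"
    print(f"{name}: saves ${savings:.0f}/yr, costs ${cost}/yr -> {verdict}")
```

Real risk management adds plenty of wrinkles (risk aversion, correlated events, fat tails), but even this crude arithmetic strings the three parts together.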
I often get called level-headed, but I don't sleep soundly because I'm a denier - it's because I know I've taken all the actions I could come up with to benefit the situation, and everything else is up to the universe.
Your average Joe is completely fucking hopeless at judging risk. This is why the US has the TSA, and you can't legally ride a bike without a helmet in my country.
I'm inclined to think this risk illiteracy suits the powers that be just fine; they're constantly manipulating people's perceptions of risk, as a means of pushing buttons in the reptile brain. On a population level, folks can be thus controlled with predictable results.
Developing an informed appreciation of risk is one of the less obvious ways of freeing your mind, with a huge payoff; suddenly you notice all kinds of authorities lying through their teeth. If enough people took that step, we'd suddenly be a whole lot closer to democracy.
The first time I encountered this distinction was when everyone was planning "earthquake kits" in the wake of Katrina hitting NOLA. The temptation seemed to be to try to throw anything and everything into the kit that might possibly be useful in a disaster. The resulting mess was dismantled and parted out in the divorce (a far more likely disaster that there seems to be no preparing for!)
Since then, daily life has been enough of a disaster that it's hard not to get denialist: who wants to think about things getting worse than this?