I hate to go this way, but I think the first thing to do is disarm or disable whoever is trying to force you to make such a choice. Then, form two teams and rescue both groups at once. Or come up with a third plan.
What if the "people" are corporations?
My answer is mu. Unask the moral dilemma.
The type of person who would actually have to make such a decision would know more context or at least would have previously received training or some form of preparation for the fact that they might eventually have to make such a decision. A random person on the internet will likely never be put in such a situation, so the moral implications of their answer are ultimately meaningless unless you can reliably extrapolate how they would act in a situation that they might actually encounter.
Go with the third plan; that's always my preferred choice. Or, "What would the Doctor do?"
Failing that, though, I picked option 1, because I could not bear the thought of rescuing no-one, especially at such a high probability.
Though, again, I think this scenario misses something about the way we assess real-life probabilities on the fly. Nobody knows for sure what the odds are, or what all the possible outcomes of a situation are. In real life, you're standing outside a burning building with children inside; you have training in firefighting and rescue, you have equipment, you have experience, you have the anecdotal experience of your colleagues. You mash all these up together to arrive at a solution that will save as many lives as possible, you hope.
But you don't know the odds because you never have perfect information. Maybe the kids on the second floor are already dead from carbon monoxide poisoning, maybe there's a rotten floorboard that will give way as you approach the second floor, maybe the wind will shift and start blowing the flames away from the second floor. A good chance or a poor chance, that's all you can guess at.
My strategy would be to rescue half the miners, then divert the trolley from that other hypothetical down the mine shaft to try to save the others.
The purpose of moral dilemmas like this is to get us to agree to something that seems sensible in theory, and then get us to put the theory into practice in a real situation that appears analogous to the theoretical one but is not actually meaningfully similar to it. It's a great technique for getting people to agree to go to war, for example. But let's not pretend that this is a moral question. It's not. It's a propaganda tactic that we've heard so many times that we have internalized it, and actually believe we use it to draw sensible conclusions.
Ever wonder why we do so many sensible-sounding things that wind up turning out badly in ways that are obvious in retrospect? This kind of reasoning is why.
What these abstract problems do is put pressure on that intuition. Why is letting someone die to save 5, and succeeding, worse than trying to save them all and failing?
Our intuitions might be irrational.
I fear that movies and popular fiction have conditioned us to try to save everybody even when the probabilities don't make the numbers a wash. People write stories, and people want to see stories of valiant success against the odds. So in our heart of hearts we're going to believe in success.
Of course the query as presented assumes perfect knowledge of the odds. But look at Captain Sullenberger and the "Miracle on the Hudson." He made the decision to make a risky water landing rather than put additional people on the ground at risk by trying to turn back to the airport and possibly not making it. It was a decision that he made in seconds. How would we view that decision if the plane had broken up and sunk into the river, killing nearly everybody, like the Air Florida crash? How would we feel if he had crashed into a neighborhood? In principle the choice is the same, although the people on the ground are not a "sunk cost." He chose to put fewer people at risk rather than pursue the perfect (passengers and plane saved) solution.
Yeah, I read this and immediately thought "oh, goody. Yet another Ticking-Time-Bomb problem."
If you haven't already, I strongly recommend the article "Hypothetical Torture in the 'War on Terrorism'" by Kim Lane Scheppele. The tl;dr is that in the specific, narrow case of the TTB as it is stereotypically presented, torturing the subject makes sense. However, and this next bit is critical, the specifics of the TTB are so far removed from actual life that agreeing to torture in the case of the TTB is meaningless. In the real world, torture is always the wrong call.
Same with this miners/babies thing. Pick whatever you like and justify it however you want (I say: let fifty miners die, then loot their bodies while everyone else is outside celebrating. With the babies, tell the parents you're only gonna rescue 50, and hold an auction to see who ponies up the most in ransom. "Don't you love your baby? Well now, ma'am, you're going to have to do a bit better than $256.45. Look, there's an ATM over there - go drain all your bank accounts, and then we'll talk."), because it really doesn't mean anything. The example is SO specific, SO known, and SO devoid of context that it is utterly divorced from reality, and your answer, any answer, is meaningless.
I'd opt to rescue the miners with a tactical operation using 100 horse-sized ducks. It only makes sense.
One miner. One horse-sized duck. Everybody wins.
The unspoken assumption there is that our moral reaction is linear in the number of deaths. I'm unconvinced that it is. Are 100 deaths exactly twice as bad as 50? The death of the Earth's whole human population seems like it would be more than twice as bad as the death of half.
Under a moral calculus in which 100 deaths are more than twice as bad as 50, there would be a skew towards the first option.
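The arithmetic behind that skew is easy to check. Here is a minimal sketch using an assumed quadratic cost function (a toy model, not anything from the original dilemma): both options have the same expected number of deaths, but a convex "badness" measure prefers the certain rescue of 50 over the all-or-nothing gamble.

```python
def disutility(deaths: int) -> float:
    # Assumed toy model: convex (quadratic) cost, so 100 deaths
    # count as more than twice as bad as 50 deaths.
    return deaths ** 2

# Option 1: rescue 50 for certain -> 50 die with probability 1.
option_1 = 1.0 * disutility(50)

# Option 2: gamble -> 50% chance everyone lives, 50% chance all 100 die.
option_2 = 0.5 * disutility(0) + 0.5 * disutility(100)

print(option_1)  # 2500.0
print(option_2)  # 5000.0
print(option_1 < option_2)  # True: the convex measure favors option 1
```

With a linear cost (`disutility(d) = d`) the two options would score identically (50 expected deaths each), which is exactly why the linearity assumption does all the work in calling the choices "mathematically the same."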
Also, I think that people mostly choose option 2 because they don't really believe the setup of the problem - not at a gut level. In real life, you might go in with the intention of saving everyone but be forced to reevaluate as things go.
Kobayashi Maru - do what Kirk did… cheat and rescue them all.
Statistically, it does not matter as far as the miners go. However, for the cause of mine safety, the 100% certainty of recovering some information about what caused the disaster makes the sure rescue of 50 the proper choice in that case. In the case of the babies, risk, information, liability, emotional torment, etcetera will be ruinous either way, so whichever way the rescuers are leaning is the proper choice. Barring that, flip a coin.
Thank you. Why is this so hard for people? Does no-one ever sit down and think about morality, and instead just come up with random-ass rationalizations for whatever strategy best mimics the last action movie they saw?
All these moral dilemmas are hypothetical. As fictions, this is exactly the right thing to do: cheat. All you need is a scapegoat or antagonist force to throw our heroes against.
GoldenRod's split-second look up, breaking the 4th wall, is very unnerving for some reason.
Stalin was right. One death is a tragedy. A million deaths is a statistic.
We can watch the WTC collapse to Yakety Sax and laugh it up. However, I think more people would have a problem with watching WTC jumpers scored to Yakety Sax. (Oh, if only BB were 4chan!)
Or: pipe sleep gas in to the miners - then they lose their cognisance and you can do whatever you like.
They may be mathematically the same, but the moral dilemma is completely different. Morality isn't an actuarial exercise.
Except the guy who has to feed and clean up after those monster ducks afterwards. That guy thinks youâre a dick.