Before I read the rest of Kiekeben’s essay: I’m going to say “Flip a coin.” It’s a simple way to short-circuit the alien’s mind-reading and make the challenge fair.
Assuming there’s a 50/50 chance that the $1M is inside, and mind-reading aside, it always makes more sense to take the closed box. Half a chance at $1M is worth far more than a guaranteed $1K. Unless you’re just about to starve to death (or unless you can get a good ray-gun for $1K and steal the alien’s money).
Now, off to read.
I have had a life full of people telling me I’ll receive some Big Reward for taking the chance, only to end up getting screwed, lied to, hyped, etc. These days, I tend to go with the sure thing. I would take the certain thousand bucks rather than a 50% chance of nothing at all. The alien would have foreseen my choice of taking both boxes.
All the discussion I’ve ever read about Newcomb’s Paradox seems to ignore the most basic fact of the situation: the amount of money on offer changes the way one thinks about the outcome. The problem can’t be reduced to simple economic theory and pure logic, because human emotion has to come into play (and human beings are an intrinsic part of the paradox: it’s not just abstract theory). A million dollars is a lot of money to most people on Earth, but a thousand dollars, though not life-changing, is also a lot of money.
What if both boxes had a dollar each? It wouldn’t matter at all which choice you made, because the equivalent of finding a dollar on the street is not meaningfully different from finding two dollars on the street. You might as well toss a coin and render the idea of an accurately predicting alien irrelevant.
What if the transparent box contained ten dollars, and the opaque box $50 million? Philosophy becomes irrelevant: whatever you think of free will, you would be an idiot to choose both boxes. (Even if you don’t want the money for yourself, think of all the good you could do with it.)
The alien intelligence (I prefer to think of it as a really smart computer program that asks a variety of seemingly random questions before making its prediction, in the style of 20q.com) doesn’t even need to be right 100% of the time. If you knew it could predict your choice with anything better than 50.05 per cent accuracy, you would always be better off taking only the opaque box: that’s the point where the expected value of one-boxing, p × $1M, overtakes the expected value of two-boxing, (1 − p) × $1M + $1K.
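A minimal sketch of that break-even calculation, assuming the alien is right with the same probability p whichever choice you make (the function names are mine):

```python
# Expected values as a function of predictor accuracy p.
BIG, SMALL = 1_000_000, 1_000

def ev_one_box(p):
    # The opaque box is full only if the alien correctly foresaw one-boxing.
    return p * BIG

def ev_two_box(p):
    # The opaque box is full only if the alien wrongly foresaw one-boxing.
    return (1 - p) * BIG + SMALL

for p in (0.50, 0.5005, 0.51, 1.0):
    print(p, ev_one_box(p), ev_two_box(p))

# Break-even: p*BIG = (1-p)*BIG + SMALL  =>  p = 0.5 + SMALL/(2*BIG) = 0.5005
```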
There is also the possibility that the alien being is lying.
I love that this very discussion is a perfect candidate for Kiekeben’s “certain state of affairs”.
In which case, I will so totally take only the closed box.
You’re assuming that we can all agree that there is some amount of risk, however small.
The central debate here is whether or not we can know beyond a shadow of a doubt that the risk is zero. Gardner and Nozick believe we can, and for them taking both boxes is simply a “no-brainer”, regardless of the amounts in question.
If the alien is telling the truth about its ability to predict my behavior, then there is no state of the world where I take both boxes AND the opaque box contains money, and there is no state of the world where I take only the opaque box and it does not contain money. The dominant strategy exists only if the alien can be wrong, which is impossible by construction.
That is, the alien’s perfect foreknowledge is not consistent with the existence of free will; instead it partitions the world into possible and impossible states. If I “choose” two boxes, it is because the alien knew I would choose two boxes; the opaque box is empty. The causality runs from the prediction forward. But this is inconsistent with choice, and thus is a paradox by construction, exactly like the irresistible force meeting the immovable object.
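To make the partition concrete, here’s a tiny sketch that enumerates the four (choice, contents) combinations and marks which ones a perfect predictor leaves possible (the helper names are mine):

```python
BIG, SMALL = 1_000_000, 1_000

def consistent(choice, contents):
    # Perfect prediction: the opaque box is full iff the alien foresaw one-boxing.
    return (contents == BIG) == (choice == "one_box")

for choice in ("one_box", "two_box"):
    for contents in (BIG, 0):
        payout = contents + (SMALL if choice == "two_box" else 0)
        status = "possible" if consistent(choice, contents) else "impossible"
        print(f"{choice}, opaque box = ${contents}: {status}, payout ${payout}")

# Only two states survive: one_box -> $1,000,000 and two_box -> $1,000,
# so the "dominance" comparison never gets off the ground.
```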
Angels dancing on the head of a pin.
Cause and effect must always follow the arrow of time. The idea that making one choice over another in the present influences a past action attempts to violate that law.
It can be demonstrated, logically, that if you assume something that’s false, you can prove absolutely everything. This problem assumes a ton of false things. Why do people spend their time on thought experiments that cannot possibly reflect reality? At least Einstein’s Gedankenexperiments had some firm grounding in real physics, instead of being made out of angels who can magically predict the future and change the past and cause people to make bizarre decisions.
Why are people still struggling to prove they are rational, when that idea was debunked, what, a century ago?
Weirdly enough, a long time ago I was hitchhiking down a long and lonesome road. All of a sudden, there shined a shiny alien in the middle of the road. And he gave me this exact paradox. I punched him in the snout and took his wallet.
The wallet was empty except for a note. It said “I knew you would take this”. Fucking aliens.
I would first take a photo, as that in itself has to be worth more than $1000. Then I would take the closed box. If the alien is really intelligent, he will have guessed correctly. If the box is empty, it was always empty; he’s not as smart as he claims, and I still have the photo. If I take the $1000, there will always be the mystery of whether the other box would have had money in it, and of what an alien was doing posing as a game show host on Earth. On the other hand, 999 people have done this before and I’ve never heard of any of them, so he may not be as benevolent as he seems. In any case, $1000 is not worth worrying about in this context.
If you take the closed box, you’re risking a potential upside of one thousand dollars (versus zero) on a bet that the alien really can predict your future choices.
If you take both boxes, you’re risking a potential upside of one million dollars (versus a thousand) on a bet that he can’t.
There is no outcome in which you are worse off than you were to begin with.
To some extent, this is a measure of privilege: I am wealthy enough that it wouldn’t hurt my feelings even a little bit to miss out on a thousand-dollar gain. However, a million dollars is a life-changing sum of money: it’s a comfortable early retirement. So it is obviously more rational for me to bet on the alien and take the closed box.
However, if you changed the amounts in the boxes to a million and a billion dollars, even though the proportions are the same, the evaluation would change. Give me a million dollars, and an additional billion is a nice bonus, but heck, I’m already comfortably retired. I’ve probably got a nice sailboat; I can get along without a mega-yacht.
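A toy way to see why the proportions don’t carry over: compare how much of the jackpot’s utility the sure thing already buys you at each scale. This sketch uses log utility and a $50K starting wealth, both of which are my assumptions, not anything in the problem:

```python
import math

def gain(amount, base=50_000):
    # Utility gained by adding `amount` to an assumed base wealth.
    return math.log(base + amount) - math.log(base)

# Original stakes: a sure $1K vs. a predicted $1M.
print(gain(1_000) / gain(1_000_000))          # ~0.007: the sure thing is noise
# Scaled stakes: a sure $1M vs. a predicted $1B.
print(gain(1_000_000) / gain(1_000_000_000))  # ~0.31: the sure thing now matters
```

On that toy model, the sure thousand buys you under 1% of the million’s utility, while the sure million buys you about a third of the billion’s, so the gamble looks much less compelling at the higher scale.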
The calculus also changes if you alter my life situation: if I’m starving to death, or I need a thousand bucks to pay off a loan shark, then the sure thing becomes a great deal more attractive.
All of which is to say that this supposed “paradox” is very poorly thought out.
And yet people have been debating it for almost fifty years, with every camp certain that it has the final word, and people still can’t seem to agree on what it all means, let alone if it has a rational solution. (It’s been a while since I read Gardner’s discussion, but I think he quoted Isaac Asimov as saying that anyone who takes only the opaque box is not recognizably a human being. Whew!) Whatever its flaws, that thing has got some staying power.
Every single thing in life involves some risk, though. There’s a nonzero risk that the superintelligent predictor is going to abduct you if you make the wrong decision, or that the $1000 bill is coated in flesh-eating bacteria. For me, given a choice between an absolute certainty of $1000 or a strong likelihood — what I have been informed is a certainty, if I take a leap of faith — of $1,000,000, I’ll take the leap. Other people might choose otherwise.
There’s a difference between something being physically vs. logically possible, just as there’s a difference between something being physically vs. logically risky. Everything is physically risky, but not everything is logically risky. Physical risk is for boring old actuaries, logical risk for exciting young philosophers!
This video makes me uneasy. I get a sense of existential dread, listening to these stories and trying to imagine what I’d do if it were me. In the hitchhiking story, my life is worth a thousand dollars, and do I want to risk it for that much? Wait, who writes this shit? People routinely pay much more than that trying to save their own lives and the lives of loved ones, with far greater uncertainty than the hitchhiker faces. And when a mad scientist offers me a thousand bucks, no strings attached, am I really going to pay that much attention to the other 99.9% of the potential prize, based on some abstract notion of fate? It’s not like I did anything to earn either the thousand or the million, and I’d far rather have done something worth that much money than merely be offered that much for no reason. Again, who writes this stuff?
So the sense of unease doesn’t come from my own struggle with the puzzle, but rather from the idea that I’m surrounded by (otherwise smart) people who take this sort of thing seriously. I fear I have a very different value system from the rest of you, and it’s dangerous to admit it.
No, see, this problem is not like Schrödinger’s cat. The money is either there or not, but it’s not 50:50. You should be able to logically determine whether it is or isn’t, without seeing it or resorting to a coin toss.
Time flies like an arrow
Fruit flies like a banana