Saturday morning mind-benders: "Newcomb's Problem" and "Parfit's Hitchhiker" dilemma

I’d pull the lever to switch the trolley onto the track the alien was standing on.

5 Likes

But in that case, the decision tree becomes a decision jungle. What is the alien lying about? The amount of money, the prediction machine, the alien is really your 2nd grade teacher, you’re the alien, you’re an android, there’s a ray-gun hidden under the table, the galactic peace treaty rides on your uninfluenced decision, the fact that it’s all a set-up for the Perseus Arm reality-quiz show hit You Bet Your Monkey where the punchline is always that the monkey in question gets eaten*?

*You’re the monkey in question, btw.

Before looking at the video, I tried to think through the problem first. After reading it several times trying to see the paradox, I drew up a cost/benefit table for the situation.

There are only four outcomes: on one axis you have the alien’s prediction being accurate or inaccurate, and on the other axis you have your decision to take both boxes or just the closed one.

  • If the alien’s prediction is accurate and you take both boxes, then you gain: $1,000
  • If the alien’s prediction is accurate and you take the one box, then you gain: $1,000,000
  • If the alien’s prediction is inaccurate and you take both boxes, then you gain: $1,001,000
  • If the alien’s prediction is inaccurate and you take the one box, then you gain: $0

So the first observation is that whatever decision you make, you won’t lose anything. [Edit for lapse in memory]

The second observation is that you have no way to factor the alien’s prediction into your decision, except to accept that the alien has been right 999 times out of 999. You might be the lucky/unlucky one in a thousand, but that’s not the way to bet. So you may as well ignore the column ‘Alien prediction is wrong’ and just pick the maximal reward from ‘Alien prediction is correct’, which is to take the closed box. You win $1,000,000 because you’re a human being and not an economist.
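For what it’s worth, here’s that table as a quick Python sketch, using the dollar figures from the problem and treating the alien’s 999-for-999 record as near-certainty (the names are just mine for illustration):

```python
# Payoff table for Newcomb's problem, as laid out above.
# Keys: (your_choice, prediction_correct) -> dollars gained.
payoffs = {
    ("both boxes", True): 1_000,       # alien foresaw the grab, closed box is empty
    ("one box",    True): 1_000_000,   # alien foresaw restraint, closed box is full
    ("both boxes", False): 1_001_000,  # alien guessed wrong, you get everything
    ("one box",    False): 0,          # alien guessed wrong, closed box is empty
}

# If the alien has been right 999 times out of 999, treat its accuracy as ~1
# and compare the two choices only within the "prediction correct" column.
for choice in ("one box", "both boxes"):
    print(choice, "->", payoffs[(choice, True)])
# one box -> 1000000
# both boxes -> 1000
```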

Now, I note that the million bucks is $1,000 less than the maximum you could win if you convinced the alien that you were going to pick the one box but managed to fake it out by taking both, but that seems like an awful lot of work for a 0.1% gain, and a significant risk that you’re not as smart as you think you are (usually a good bet). I’d be happy with the million. Easy money.

I don’t really see the paradox, unless one has an unhealthy obsession with being unpredictable or gaining the absolute maximum from any situation. Have I missed something?

1 Like

Both of these paradoxes assume that it’s irrational to decline to take or keep $1,000. But I don’t think it is irrational. Why is “rational”, or even “selfish”, defined as “maximising wealth”? I think that’s the crux of the confusion here. At a certain point, maximising wealth is no longer a rational thing to do. If we stopped eating, stopped having relationships, stopped enjoying ourselves, in order to concentrate on maximising our wealth, we wouldn’t be benefiting ourselves; we’d be doing ourselves harm. Once we have enough money to live comfortably, pursuing more money is not “rational” or “selfish”. It’s a waste of time.

So leave the $1,000 in the box, or give it to the car driver. It’s irrational to be obsessed with money to the exclusion of everything else, and being selfish isn’t just about accumulating money. Maybe to someone who has a million dollars, an additional thousand isn’t even worth the energy it would take to carry the box. And who knows if I would ever spend that additional thousand dollars? If I reach the age of 100 and die with more than $1,000 still in my bank account, then taking that money wasted precious seconds of my life that I could have spent on something of real value, and it was a net disbenefit.

1 Like

This whole situation just has scam written all over it. The ‘mark’ is being asked to give up a ‘small’ amount of money in order to secure the big pay out. Sounds just like the classic violin scam. I bet in the fairground/scientist version of this the mark had to pay $10 to go into the tent and then the second box is empty after all… The ‘first’ ever time the scientist predicts wrong :smile:

1 Like

Prescience opens the gate to a collective unconscious. Is this alien being truly alive? Is he existent in the same cosmic collective as the man making up his mind what box to take? If he is a supreme being, then does he project any form of prescient thought upon the unwitting, wanting human? Is it of any benefit to the will of this alien to give any sum of money away, or to cause despair, to a collective he may or may not empathize with?

If the alien is in fact deceased (but existent in space like Kai from LEXX, say), belonging only to himself and the prescient future of all others’ outcomes, he is by logic a miserable battery, unplugged from his own future because time is nothing to him, and he’s probably (75%) a cheapskate too.

Time for him is irrelevant, and he knows that for modern men and women time is money, necessary to sustain the life he no longer belongs to as part of a prescient collective. So his desire to drag out the conundrum at as little expense as possible, to a. starve the living, and b. repeat the experiment as long as possible to alleviate his supreme loneliness, is evident.

So take pity on this visitor, go for the cool $1000.

Unless they mixed up the boxes, in which case some physicist is going to be pleasantly surprised and some alien is going to jail on charges of animal cruelty.

1 Like

You can solve it with game theory, given that both ‘players’ know the rules of the game, and provided you can accept the premise that human behavior can be accurately predicted (which isn’t that tough).

Two moves. Two outcomes. There’s no $1,001,000 gain possible. Just $1,000,000 or $1,000 as options.

The game is the same as paying $1,000 to gamble on winning $1,001,000.

What you do depends on your utility for $1K versus the chance at $1M, and your beliefs about what the alien is capable of predicting. It’d be interesting to ask the alien how many previous players won $1M.
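Sketching that trade-off (assuming your utility is just the dollar amount, with p standing in for your estimate of the alien’s accuracy), the break-even point turns out to be barely better than a coin flip:

```python
# Expected dollar value of each choice as a function of p, your estimate of
# how often the alien predicts correctly (assuming utility = dollars).
def one_box(p):
    return p * 1_000_000                   # full closed box only when the prediction was right

def two_box(p):
    return 1_000 + (1 - p) * 1_000_000     # open box always, plus the closed box only on a miss

# Break-even accuracy: one_box(p) == two_box(p)  =>  p = 1_001_000 / 2_000_000
breakeven = 1_001_000 / 2_000_000
print(breakeven)                           # 0.5005 -- any predictor better than ~50.05% favors one-boxing
print(one_box(0.999), two_box(0.999))      # 999000.0 vs 2000.0 at the alien's track record
```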

Hold on, I know this one. Both boxes are poisoned!

6 Likes

I love Newcomb’s Paradox. It’s asking more complicated philosophical questions than a lot of you are giving it credit for. I think Parfit’s Hitchhiker is a funny near-restatement of the problem. What I don’t think people usually notice is that Newcomb’s Paradox is the prisoner’s dilemma from your side: Box B is cooperation, Box A is defection. It’s just that the payouts have been changed from the normal 5, 3, 1, 0 to 1,001,000, 1,000,000, 1,000, 0, which skews the game a little. But from the alien’s side, presumably their goal is to correctly predict you and they don’t care about the money. So their payout is more like 5 for both cooperate, 5 for both defect, 0 for a mismatch.

The normal analysis of the prisoner’s dilemma is that no matter what your opponent does, you always get more points by defecting, therefore you should defect. That sounds right. But in this game, it is clearly the case that regardless of whether it is logical for you to defect or cooperate, the opponent is going to get 5, because they can follow the same logic you can; they don’t care about the money.

So you are faced with a situation where if logic tells you to defect you get 1 point, but if logic tells you to cooperate you get 3. The question is, if logic tells you that logic is getting you fewer points than being illogical would, does logic then tell you it’s time to be illogical? If so, you are doing a lot better than if not. Of course the same applies to the actual single-iteration prisoner’s dilemma as well. If it were actually the right play to cooperate, then we’d all be better off, so isn’t it the right play to cooperate? How much do we owe to this system designed to produce misery?
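Roughly, the mapping looks like this. The payoff pairs are just my restatement of the numbers above, standard prisoner’s-dilemma points alongside the Newcomb dollars, with the alien’s “move” being its prediction of yours:

```python
# Prisoner's-dilemma framing: your move is "cooperate" (one box) or "defect"
# (both boxes); the alien's "move" is its prediction of what you will do.
# Standard PD points on the left, Newcomb dollars on the right.
your_payout = {
    ("defect",    "cooperate"): (5, 1_001_000),  # alien predicted one box, you grabbed both
    ("cooperate", "cooperate"): (3, 1_000_000),  # mutual cooperation: one box, and it's full
    ("defect",    "defect"):    (1, 1_000),      # alien saw the defection coming
    ("cooperate", "defect"):    (0, 0),          # alien wrongly expected a grab
}

# The dominance argument: whatever the alien "plays", defecting pays more...
for prediction in ("cooperate", "defect"):
    print(prediction, your_payout[("defect", prediction)], ">", your_payout[("cooperate", prediction)])
# ...but a near-perfect predictor means the mismatched rows almost never happen,
# so in practice you are choosing between the matched rows: $1,000,000 vs $1,000.
```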

Yeah, but we generally regard “if I knew it was going to rain I’d have brought my umbrella” as a reasonable thing to say and “if I knew it was going to rain I’d be 70 feet tall and made of steel” as an unreasonable thing to say.

2 Likes

Well you’re no fun.

But really - if I told you I’m an alien with a mind-reading ray and a million dollars to offer you, would you puzzle over the terms and conditions? Or would you laugh me out of the room?

I’ll tell you what I would do: I’d take the box with the $1,000 and get out of there.

1 Like

Maybe the alien is lying about everything, which would explain why there is still $1,000 in the open box. I would take the $1,000 and walk away. That is what the machine would have predicted I would do anyway.

Actually, you’re taking the closed box either way. The decision is whether to take ONLY the closed box, or to take the closed box AND the open box. But if you take both, the alien may have predicted this, in which case the closed box you took along with the open one will be empty.

If you watch the video, the lady presents the problem in a more reasonable way. Well, at first she doesn’t, but about halfway through she points out that with $1,000,000 and $1,000 the mad scientist (not alien) could be right only 80% of the time or so, and the paradox would still hold. She claims that some kind of brain scan, or even just extrapolation from a person’s past history, is not entirely out of the realm of possibility.
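A quick arithmetic check of that 80% figure, assuming the same dollar amounts as above and counting only expected dollars:

```python
# Plugging an 80%-accurate predictor into the expected-value comparison.
p = 0.8
one_box = p * 1_000_000                # 800,000.0
two_box = 1_000 + (1 - p) * 1_000_000  # 201,000.0
print(one_box, two_box)                # one-boxing still wins comfortably at 80% accuracy
```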

This topic was automatically closed after 5 days. New replies are no longer allowed.