The two possibilities are not equally probable, though. Just like the two possibilities of “either I have a winning raffle ticket, or I don’t” are not equally probable.
In my example, with no switching allowed, I hope you agree your chances of winning are 1 in 3, and opening a door before showing whether you won is just showmanship. If you agree with that, then how can that possibly square with the odds of winning being 1 in 2 just before you win? The odds can’t be both 1 in 3 and 1 in 2.
Yes, this is absolutely part of the original problem. The article posted is wrong.
Standard assumptions
Under the standard assumptions, the probability of winning the car after switching is 2/3. The key to this solution is the behavior of the host. The Parade version does not explicitly define the host’s protocol. However, Marilyn vos Savant’s solution (vos Savant 1990a) printed alongside Whitaker’s question implies, and both Selvin (1975a) and vos Savant (1991a) explicitly define, the role of the host as follows:
The host must always open a door that was not picked by the contestant (Mueser and Granberg 1999).
The host must always open a door to reveal a goat and never the car.
The host must always offer the chance to switch between the originally chosen door and the remaining closed door.
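The three rules above pin down the host’s behavior completely, so they can be turned into a short simulation. Here’s a minimal Python sketch (the function name `play` and the trial count are my own choices, not from any quoted source):

```python
import random

def play(switch, trials=100_000):
    """Simulate the game under the three host rules; return the win fraction."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)   # car hidden uniformly at random
        pick = random.randrange(3)  # contestant's first pick
        # Rule 1 & 2: host opens a door that is neither the pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Rule 3: contestant may switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print(play(switch=True))   # ~0.667
print(play(switch=False))  # ~0.333
```

Run it yourself: switching wins about two thirds of the time, staying about one third.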
The key is that the host will open an empty door every single time, no matter what.
If, by your understanding, your probability of being right always jumps to 1/2 when he opens an empty door, you’d have to conclude that, no matter what you picked the first time, you always magically had a 50% chance of being right! How did you do that??
Since that’s nonsense, you have to accept instead that your chance of being right remains at 1/3, so the remaining door must have 2/3 chance of being right.
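You don’t even need randomness to check that claim: fix the car behind one door and enumerate all three possible first picks. A quick sketch (the variable names here are mine):

```python
# Exhaustive check: car behind door 0, enumerate the contestant's 3 possible picks.
car = 0
switch_wins = 0
for pick in range(3):
    # Host opens a goat door that isn't the pick.
    opened = next(d for d in range(3) if d != pick and d != car)
    # Switching means taking the one remaining closed door.
    final = next(d for d in range(3) if d != pick and d != opened)
    switch_wins += (final == car)
print(switch_wins, "of 3 picks win by switching")  # 2 of 3
```

Your first pick is right in exactly 1 of the 3 cases, and in the other 2 the switch lands on the car.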
I have got to figure out when a class at the university is teaching this and set myself up at a card table giving people an opportunity to play for $11 with a possible payout of $20. I would expect to make money because I would expect to pay out $20 50% of the time, and people who believe the analysis that you should always switch should play in droves, because they would expect to get $20 66% of the time if they switch.
I guess if anyone is really having trouble wrapping their heads around it and none of the explanations in this thread make it make sense, I’ve got one more possibly-useful thing to say: Don’t worry about it, probability is a mindfuck from top to bottom. People with PhDs in combinatorics can still have their grasp of reality shaken by this stuff and there is basically no way to tell a complex problem from an easy one.
If you ever have to actually play the Monty Hall problem for a prize, then switch if you think the game is on the up-and-up or stay if you think they are likely cheating, and don’t worry about how you can’t make sense of that.
Giving people a choice with new knowledge might change the odds. Giving people a choice with no new information does not. I knew when I picked the door that one of the doors I did not pick was a loser. Showing me that one of the doors is a loser doesn’t actually tell me anything.
And yet you maintain that it tells you that the odds that the non-picked, non-revealed door is the one with the prize have increased. I, OTOH, maintain that it increases the chances for both non-revealed doors. I say that “probabilities” are ALWAYS a statement about the knowledge of the guesser. When you change the knowledge of the guesser, you change the probabilities of the remaining doors. Just as if you had added doors and new contestants, each one would be less likely to guess the correct door.
I just tried the simulator at this site and got the results below. You can try it yourself. After 21 tries (switching every time), I was at a perfect 66.67% success rate.
In my example, with no switching allowed, I hope you agree your chances of winning are 1 in 3, and opening a door before showing whether you won is just showmanship. If you agree with that, then how can that possibly square with the odds of winning being 1 in 2 just before you win? The odds can’t be both 1 in 3 and 1 in 2.
I think there are 2 different scenarios.
3 doors, you pick which one you think has the prize. 1 in 3 chance
3 doors, 1 is open and clearly has no prize. 1 in 2 chance
Then why do simulations run thousands of times demonstrate that switching wins 2/3 of the time and staying only wins 1/3? I’m honestly asking here–what do you think explains the 2/3 vs. 1/3 outcomes, if not the explanations that folks have provided for you here?
What do you guys think about running a simulation right here in this thread? It should be relatively easy and I’m sure we can all agree to a methodology.
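Here’s a sketch we could use as the agreed methodology, with a toggle for the one assumption people keep disagreeing about: whether the host knows where the car is. (The `host_knows=False` variant, where the host opens an unpicked door at random and rounds where he exposes the car are thrown out, is my own addition to show why the host’s protocol matters.)

```python
import random

def trial(host_knows):
    """One round; returns True/False if switching wins, or None for a discarded round."""
    car = random.randrange(3)
    pick = random.randrange(3)
    if host_knows:
        # Standard rules: host deliberately opens a goat door.
        opened = next(d for d in range(3) if d not in (pick, car))
    else:
        # Ignorant host: opens a random unpicked door; discard if it's the car.
        opened = random.choice([d for d in range(3) if d != pick])
        if opened == car:
            return None
    switched = next(d for d in range(3) if d not in (pick, opened))
    return switched == car

def win_rate(host_knows, trials=100_000):
    results = [r for r in (trial(host_knows) for _ in range(trials)) if r is not None]
    return sum(results) / len(results)

print(win_rate(True))   # ~0.667: switching wins 2/3 under the standard rules
print(win_rate(False))  # ~0.5: with an ignorant host, the 50/50 intuition is right
```

Under the standard rules switching wins 2/3 of the time; with an ignorant host it really is 50/50, which is exactly the scenario the 1-in-2 camp is (unknowingly) describing.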
Your simulation is calculating the odds based upon choosing to switch when there are still 3 options. I agree, in that scenario the odds are 1 in 3 without switching.
Are you saying that with this scenario below, there is still an 1/3 chance of selecting the correct door?
3 doors, 1 is open and clearly has no prize.