I'm not really sure why staying with A wouldn't be the same as choosing between A and B (just as switching to C would be the same as choosing between B and C).
I understand the math behind the problem and have written some computer simulations showing that if you switch, you do in fact win twice as often. However, if you make your second choice at random between the two remaining doors, the odds of winning become 50%.
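For anyone who wants to reproduce those numbers, here's a minimal sketch of the kind of simulation I mean (the door labels 0-2 and the function name are just my own choices):

```python
import random

def play(strategy, trials=100_000):
    """Simulate Monty Hall; strategy is 'stay', 'switch', or 'random'."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)    # door hiding the car
        first = random.randrange(3)  # contestant's first pick
        # Host opens a goat door that isn't the contestant's pick.
        opened = random.choice([d for d in range(3) if d != first and d != car])
        # The one remaining unopened door.
        other = next(d for d in range(3) if d != first and d != opened)
        if strategy == 'stay':
            final = first
        elif strategy == 'switch':
            final = other
        else:  # 'random': choose between the two unopened doors
            final = random.choice([first, other])
        wins += (final == car)
    return wins / trials

for s in ('stay', 'switch', 'random'):
    print(s, round(play(s), 3))  # roughly 0.333, 0.667, 0.500
```

Note that the "random" strategy lands at 50% precisely because half the time it happens to switch and half the time it happens to stay.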
One thing that confuses me logically goes back to my original statement: suppose someone else in the audience says, "I think the car is behind door A" after the first goat is shown to you (and let's assume he was in the bathroom when you picked your original door and the first goat was shown, so he didn't see any of this information). He has a 50% chance of being correct, right? If that's the case, how can I have a 33% chance with door A while he has a 50% chance?
I'm not sure I agree with that. The probability of each remaining door winning is 50%. You can pretty much throw out your first pick because it's irrelevant. Think about this situation: I pick my door and the host shows the goat prize. Let's assume I picked door number 1. Now someone in the audience says, "I bet door number 1 is correct." He's got a 50% chance of being correct, but according to the accepted solution, door number 1 has a 33% chance of winning. So two people betting on the exact same door have different probabilities - that just doesn't seem to make sense.
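One way to probe this apparent contradiction empirically is to simulate both bets side by side: how often the player's own door actually hides the car over many games, versus how often a latecomer wins by choosing at random between the two unopened doors. A minimal sketch (I fix the player's pick as door 0 - "door number 1" in the story - since by symmetry the label doesn't matter):

```python
import random

def audience_test(trials=100_000):
    pick_wins = latecomer_wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = 0  # the player's door; labels are symmetric
        # Host opens a goat door other than the player's pick.
        opened = random.choice([d for d in range(3) if d != pick and d != car])
        # Two doors stay closed: the pick and one other.
        unopened = [d for d in range(3) if d != opened]
        # Bet 1: how often does the player's own door win?
        pick_wins += (car == pick)
        # Bet 2: the latecomer saw nothing, so he guesses
        # uniformly between the two closed doors.
        latecomer_wins += (car == random.choice(unopened))
    return pick_wins / trials, latecomer_wins / trials

print(audience_test())  # roughly (0.333, 0.5)
```

The two numbers in the argument above come from two different bets: betting on that specific door versus guessing at random between the two closed doors.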
Hi, I'm still a non-believer in the solution to the Monty Hall problem. For those of you not familiar with it, it goes something like this:
Given 3 doors, 2 of them have "goat" prizes behind them and one has a car. You select a door, and then the host opens one of the other doors - the catch is that the opened door is ALWAYS a goat. You then have the choice to switch and take the prize behind the remaining door. Most articles on this problem (including the most accepted answer) say you should switch, since switching makes the odds of winning 2/3, and the statistics bear this out.
Now, this makes the assumption that your second pick (i.e., whether to switch or not) is dependent on your first pick. In reality it is not (or does not have to be). I approach the problem like this: I pick a door, and the host shows me a goat. I then randomly choose one of the two remaining doors. Thus, statistically speaking, I now have a 1/2 (50%) chance of winning.
I'd like to discuss this theory vs the "accepted" theory.