A job I was looking at had a requirement that read: "Inability to stop thinking about the two envelopes problem unless you’ve truly come to peace with an explanation you can communicate to us". So I thought I'd post my explanation for the problem.
The setup to the problem goes like this:
You have two indistinguishable envelopes in front of you. They both contain money, but one envelope has twice as much money as the other.
You get to choose one of the envelopes to keep. Since the envelopes are indistinguishable, you have a 1/2 chance of having chosen the one with more money.
But now, after you've picked an envelope but before your choice becomes finalized, you are given the opportunity to switch to the other envelope. Should you make the switch?
Now, one sensible and easy reply is to say that you shouldn't bother. The envelopes are indistinguishable and you have no idea which one contains more money. Your chances of getting the bigger payout remain 50-50 regardless of your choice.
But now, a wild statistician appears, and makes the following argument:
"Let's say, for the sake of argument, that the envelope you have now contains $20. Then the other envelope might contain $40, or $10. Since these two possibilities are equally likely, your expectation value after switching would be half of their sum (0.5*$40 + 0.5*$10), or $25. That's 25% more than the $20 you have now.
But if we think about this more, the initial choice of $20 actually doesn't matter. You can make the same argument for any possible value of money in your envelope. You'll always gain 25% more on average by switching. So, even without knowing the amount of money in your envelope now, you should switch."
Impressed by the wild statistician's use of numbers and such, and figuring that even if he's wrong you would at worst break even, you decide to make the switch. But then, as you're about to finalize your decision and take the new envelope home, the statistician repeats exactly the same argument, word for word. "Let's say, for the sake of argument..." He's now urging you to switch BACK to your original envelope. After all, the two envelopes are indistinguishable. If there is a rational reason to switch the first time, the same reason must equally apply for switching the second time. But at this point, it becomes obvious that if you continued to listen to the wild statistician, you would do nothing but switch the two envelopes for all eternity.
That can't possibly be the right choice. Now, here is the real two envelopes problem: something must be wrong with the wild statistician's argument - but what exactly is the nature of his error?
The solution to the problem goes as follows:
If we start by assuming there's $20 in your envelope, it is NOT equally likely that the other envelope contains $40 or $10. This is where the wild statistician goes wrong. In general, given a value x in your current envelope, it is NOT equally likely for the other envelope to contain 2x or x/2.
Before we get more mathematical, let's examine the problem intuitively, by grounding it in a solid example. Say that you're on a television game show, and you're playing this two envelopes game. You know that American TV game shows typically give prizes from hundreds to tens of thousands of dollars. Now, if the host of the show lets you know that your envelope contains $50, should you switch? I certainly would. I know that, given the typical payout of TV shows, the two envelopes were more likely set up to contain $100 and $50 rather than $50 and $25. The two probabilities are NOT EQUAL.
On the other hand, imagine that you're a high school statistics student, and your teacher is playing this two envelopes game with you for a class lesson. Your envelope contains the same $50 as in the previous example. Should you make the switch? No way. You seriously think your teacher put $100 in the other envelope to give to a high school student, for a single lesson? If your teacher has 5 statistics classes, he stands to lose up to $500 on that one lesson - likely far exceeding his pay for the day. It is much more likely that your teacher chose $50 and $25 for the values rather than $100 and $50. Again, the two probabilities are NOT EQUAL.
Now, if the two probabilities were equal, then the wild statistician would be right, and you should switch. And you should continue to do so as long as the probabilities remained equal. But the problem described by that situation is not the two envelopes problem. It's actually a 50-50 bet where if you win, you double your money, but if you lose, you only lose half your money (compare that to most casino games, where you lose your entire bet). If you find a game like that, you should continue playing it for a very long time.
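We can verify that such a double-or-half bet really does pay off 25% per round on average. The function name, trial count, and seed below are my own illustrative choices:

```python
import random

def average_factor(trials=100_000, seed=1):
    """Estimate the expected multiplier of one fair double-or-half round:
    with probability 0.5 your money doubles, otherwise it's halved."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += 2.0 if rng.random() < 0.5 else 0.5
    return total / trials

# average_factor() comes out close to 0.5*2 + 0.5*0.5 = 1.25,
# i.e. a 25% expected gain per round.
```

This is exactly the expectation the wild statistician computes; his error lies in assuming the two envelopes give you this bet.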
But for the two envelopes problem, the chances of either doubling or halving your money are generally not equal. This will be true for ANY reasonable probability distribution of possible values of money in the envelopes. "Reasonable" here means that the probability distribution must sum to one, and that it must have a finite expectation value. Consider any of the following probability distributions (or any other reasonable distribution you wish to think up) for the money in the envelopes:
The orange line is the probability distribution for the smaller amount of money in one of the envelopes. The green line is the probability distribution for double that value, in the other envelope - it's been stretched horizontally by 2 to represent the doubling, and compressed vertically by 0.5 to keep the probability normalized. You see that the two probabilities are equal (where the lines cross) only for very rare, special amounts of money. In general, if you see a small amount of money in your envelope, you're more likely to have the "smaller" of the two envelopes, and if you see lots of money, you're more likely to have the "greater" of the two. You should be able to understand this intuitively, in conjunction with the game show / statistics teacher examples given above.
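To make the crossing-point picture concrete, here is a small sketch using an exponential distribution for the smaller amount - an assumed example of my own, not one of the figures above:

```python
import math

def f(x):
    """Assumed density of the smaller amount: exponential with rate 1 (the 'orange line')."""
    return math.exp(-x)

def g(x):
    """Density of the doubled amount (the 'green line'): f stretched
    horizontally by 2 and compressed vertically by 0.5."""
    return 0.5 * f(0.5 * x)

crossing = 2 * math.log(2)   # the unique point where f(x) = g(x), about 1.386

# Below the crossing, f > g: a small amount suggests you hold the "smaller" envelope.
# Above the crossing, f < g: a large amount suggests you hold the "greater" envelope.
small_favors_smaller = f(0.5) > g(0.5)
large_favors_greater = f(3.0) < g(3.0)
```

For this particular choice of distribution, the densities are equal only at the single value 2·ln(2), mirroring the isolated crossing points in the plots.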
Whether you should switch or not depends on the expectation value of the money in the envelopes. If the amount in the "smaller" envelope is A, then the amount in the "greater" envelope would be 2A, and the expectation value for choosing them with 50-50 chance would simply be 3A/2. Since the envelopes are indistinguishable, this is in fact the expectation value of choosing either one, so it doesn't matter which one you choose. This is nothing more than the original, simple argument presented at the very beginning.
However, what if the wild statistician insists on putting the problem in terms of expected gain conditioned on the different possible values of the money in your current envelope? This is how his original flawed argument was framed. It's an overly complicated way of thinking about the problem, but shouldn't we also be able to come to the correct solution this way?
We can. (Beware, calculus ahead) Let:
x = amount of money in your current envelope,
f(x) = probability distribution of the money in the "lesser" envelope, and
g(x) = probability distribution of the money in the "greater" envelope.
Then f(x) can be completely general, but g(x) = 0.5 f(0.5x) due to the stretch/compression transformations. Also, the overall distribution for the amount in your current envelope, given that you chose one of the two envelopes with equal chance, is:
p(x) = 0.5( f(x) + g(x) ).
Then, the expectation value for switching is given by the following integral:
Expectation value for switching = ∫ E(x) p(x) dx
Where E(x) is the expectation value of switching when the money in your current envelope is x. This is given by:
E(x) = x * p("smaller" envelope|x) - 0.5x * p("greater" envelope|x)
That is to say, upon switching, you'll gain x if you currently have the "smaller" envelope, but lose 0.5x if you currently have the "greater" envelope. Furthermore, the p("smaller" envelope|x) and p("greater" envelope|x) values can easily be calculated by the definition of conditional probability as follows,
p("smaller" envelope|x) = 0.5 f(x) / p(x),
p("greater" envelope|x) = 0.5 g(x) / p(x)
noting that the numerator corresponds to getting a specific envelope AND a specific x value.
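As a sanity check on these conditional probabilities, we can evaluate them for an assumed exponential f (my own choice for illustration, matching nothing in the post's figures):

```python
import math

def f(x):  return math.exp(-x)          # assumed density of the "lesser" amount
def g(x):  return 0.5 * f(0.5 * x)      # density of the "greater" amount
def p(x):  return 0.5 * (f(x) + g(x))   # density of the amount you currently hold

def p_smaller_given(x):
    """p("smaller" envelope | x) = 0.5 f(x) / p(x)"""
    return 0.5 * f(x) / p(x)

def p_greater_given(x):
    """p("greater" envelope | x) = 0.5 g(x) / p(x)"""
    return 0.5 * g(x) / p(x)

# The two conditional probabilities always sum to 1, but they are equal
# (both 0.5, the wild statistician's assumption) only at x = 2 ln 2.
```

For any other x, one of the two probabilities exceeds 0.5, which is precisely why the naive 50-50 expectation argument fails.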
Putting this all together, we get:
Expectation value for switching = ∫ E(x) p(x) dx =
∫ (x * 0.5 f(x)/p(x) - 0.5x * 0.5 g(x)/p(x)) p(x) dx = 0.5 ∫ (x * f(x) - 0.5x * g(x)) dx =
0.5 ( ∫ x f(x) dx - ∫ 0.5x g(x) dx )
∫ 0.5x g(x) dx = ∫ 0.5x 0.5 f(0.5x) dx = ∫ 0.5x f(0.5x) 0.5dx = ∫ u f(u) du = ∫ x f(x) dx
Where we used the u-substitution u = 0.5x and took advantage of the fact that the integral runs from 0 to infinity (so the limits are unchanged by the substitution) in the last two steps. Therefore:
Expectation value for switching = ∫ E(x) p(x) dx = 0.5 ( ∫ x f(x) dx - ∫ x f(x) dx ) = 0.5 * 0 = 0
So there is no expected gain or loss from switching, which is the same conclusion we reached at the very beginning.
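The whole derivation can be checked by simulation. The sketch below samples the smaller amount A from an assumed exponential distribution (again my own choice; any "reasonable" distribution would do), assigns you one envelope at random, and averages the gain from switching:

```python
import random

def expected_switch_gain(trials=200_000, seed=2):
    """Monte Carlo estimate of the expected gain from switching envelopes,
    with the smaller amount A drawn from an Exp(1) distribution."""
    rng = random.Random(seed)
    total_gain = 0.0
    for _ in range(trials):
        a = rng.expovariate(1.0)      # the smaller amount A; the other envelope holds 2A
        if rng.random() < 0.5:        # you were handed the "smaller" envelope
            total_gain += a           # switching gains 2A - A = A
        else:                         # you were handed the "greater" envelope
            total_gain -= a           # switching loses A - 2A = -A
    return total_gain / trials

# The estimate hovers around 0, matching the integral above.
```

The same simulation with any finite-mean distribution for A gives the same answer: switching is worth exactly nothing.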
You may next want to read:
The intellect trap
Basic Bayesian reasoning: a better way to think (Part 1)
A common mistake in Bayesian reasoning
Another post, from the table of contents