This is the first of what I hope will be several posts related to the course I am giving this term on probability.

The following is a well-known paradox. You are presented with two envelopes and told that one contains a sum of money and the other contains twice as much. You are invited to choose an envelope but are not told which is which. You choose an envelope, and are then given the chance to change your mind if you want to. Should you?

One argument says that it cannot possibly make any difference to the expected outcome, since either way your expected gain will be the average of the amounts in the two envelopes (so the expected change from switching is zero). But there is another argument that goes as follows. Suppose that the amount of money in the envelope you first choose is x. Then the other envelope has a 50% chance of containing 2x and a 50% chance of containing x/2, so your expectation if you switch is (1/2)(2x) + (1/2)(x/2) = 5x/4, which is greater than x, so you should switch.
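As a quick numerical sanity check of the first argument, here is a minimal simulation sketch in Python (the amount a, the trial count and the seed are arbitrary illustrative choices) comparing the policy of always staying with the policy of always switching:

```python
import random

def play(trials=100_000, a=1.0, seed=0):
    """Simulate the game: one envelope holds a, the other 2a.
    Return the average amount received by always staying and by always switching."""
    rng = random.Random(seed)
    stay_total = switch_total = 0.0
    for _ in range(trials):
        envelopes = [a, 2 * a]
        rng.shuffle(envelopes)
        first, other = envelopes
        stay_total += first     # keep the first envelope chosen
        switch_total += other   # switch to the other one
    return stay_total / trials, switch_total / trials

stay, switch = play()
# Both averages come out close to 3a/2 = 1.5: switching confers no advantage.
```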

I tried this out for real in my first lecture, and the student who was given the choice decided to switch. Rather irritatingly, he got more money as a result. Of course, the second argument is incorrect, but the reasons are somewhat subtle. My purpose in putting up a post about it is not so much to invite solutions to the paradox as to see whether it prompts anyone to give me their favourite probabilistic paradoxes. (I’ve just done Simpson’s paradox, so that one wouldn’t be new.)


This entry was posted on February 1, 2008 at 2:04 pm and is filed under Probability. You can follow any responses to this entry through the RSS 2.0 feed.

February 1, 2008 at 3:07 pm |

It’s not really a paradox, but I like the following scenario:

I toss a fair coin, until it comes up tails. If this happens at the first toss, you’ll win 1 euro; if it happens at the second toss (so the outcome is HT), you’ll win 2 euro; if it happens at the third toss (HHT), you’ll win 4 euro, and so on, always doubling your earnings every time the coin shows heads.

What is your expected gain?
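As a hint at the answer, one can compute the expected payout of a version of the game capped at a maximum number of tosses; each possible toss contributes exactly half a euro to the expectation, so the capped expectation is N/2 and grows without bound. A short Python sketch (the cap values are illustrative):

```python
def truncated_expectation(max_tosses):
    """Expected payout of the game capped at max_tosses tosses.
    First tails on toss k + 1 has probability 2**-(k + 1) and pays 2**k euro,
    so every possible toss contributes exactly half a euro to the expectation."""
    return sum(2 ** -(k + 1) * 2 ** k for k in range(max_tosses))

# The capped expectation is max_tosses / 2, so it grows without bound:
vals = [truncated_expectation(n) for n in (10, 100, 1000)]  # 5.0, 50.0, 500.0
```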

February 1, 2008 at 3:15 pm |

I like that one too. I learned recently that it is called the St Petersburg paradox because it was discussed by Daniel Bernoulli when he was staying in St Petersburg. And it was formulated by one of the other Bernoullis in 1713. (All this information I got from proofs of articles that will be in the Princeton Companion to Mathematics.) I have a simple variant of it: would you pay ten thousand dollars for a never-to-be-repeated chance of one in 1,000 of making 100 million dollars?

February 1, 2008 at 4:50 pm |

Surely the Monty Hall problem should be mentioned. I think the martingale betting method is different from the “long odds” question, which has more to do with the non-linearity of utility as a function of money.

Lior

February 1, 2008 at 6:34 pm |

I’ve always been partial to: there are three doors and behind one is a prize; you pick a door but don’t open it; then I open one of the doors you did not pick and reveal there is no prize behind it; you can either stick with your original choice or switch…

February 1, 2008 at 6:47 pm |

There was a very nice discussion of the two-envelope problem in the American Mathematical Monthly 111 (2004), 348–351 (by Samet, Samet and Schmeidler), where a solution is presented showing that, essentially, it is not possible for two random variables X and Y to be such that X and Y are both independent of the “ranking” R, which takes value 1 (say) if X > Y and 0 if X < Y, with the same probability 1/2.

February 1, 2008 at 7:22 pm |

I agree that in many paradoxes it is utility, rather than probability, that is involved.

Another family of not-quite-paradoxes is related to Bayesian probability, and it is especially important for doctors. A typical example runs as follows: I am taking a test to check for AIDS. If I am infected, the test always shows it; if I am not infected, there is one chance in ten that the test is wrong. Only 1% of people in my country are infected, and I have done nothing particularly risky. If the test says I have AIDS, what is the probability of actually having it?
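The arithmetic behind the answer is a one-line application of Bayes' theorem; here is a sketch in Python using the numbers from the example above (perfect sensitivity, a 1-in-10 false-positive rate, 1% prevalence):

```python
def posterior(prevalence, sensitivity, false_positive_rate):
    """P(infected | positive test) by Bayes' theorem."""
    true_positives = sensitivity * prevalence
    false_positives = false_positive_rate * (1 - prevalence)
    return true_positives / (true_positives + false_positives)

# 1% prevalence, a test that never misses an infection, 1-in-10 false positives:
p = posterior(0.01, 1.0, 0.1)  # about 0.092: a positive test gives only a ~9% chance
```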

I am not sure, however, if this is what you are searching for.

February 1, 2008 at 7:44 pm |

OK, I’ve heard about the two-envelope problem before, and obviously the expectation-value argument is wrong, but how can one see that it is erroneous? What is the subtle reasoning?

February 1, 2008 at 8:20 pm |

The last part of my comment should read:

“the “ranking” R which takes value 1 (say) if X> Y and 0 if X < Y, with same probability 1/2”.

February 2, 2008 at 12:40 am |

If I am not wrong, there is nothing “wrong” with the second argument. If the first envelope contains $x$, then with probability 1/2, on switching he gets $x$ more in expectation, putting him at $3x/2$, which *is* the average of $x$ and $2x$; and with probability 1/2 he loses $x/2$, giving him in expectation $3x/4$, which *is* the average of $x$ and $x/2$.

In both cases, his expected amount is half the sum of the amounts in the two envelopes.

February 2, 2008 at 2:11 am |

Dr. Kowalski, why is that a solution to the two-envelope paradox? You could have probability <50% of the other envelope containing more money, but still get a higher expected return from switching. There’s an example of such a prior in Dieter http://personal.lse.ac.uk/list/PDF-files/envelope-paradox.PDF

BTW, is a preprint of Samet/Samet paper available online?

February 2, 2008 at 7:47 am |

I don’t know if the preprint is available online. The title is: “One observation behind two-envelope puzzles”; searching for this, I found at

http://ideas.repec.org/p/wpa/wuwpga/0310004.html

what seems to be an earlier version of it.

And I had oversimplified the conclusion of their work: there is no assumption that the ranking gives probability 1/2 to the two possibilities, only that the ranking is not constant (i.e. that neither X > Y nor X < Y holds all the time).

February 4, 2008 at 1:04 am |

Gnedenko’s book on probability (Chelsea Publishing) presents what he calls “Bertrand paradoxes” of geometric probability: when asked to find the probability that a straight line will intersect a unit disk leaving a chord of length 1/2 or more, he gives three “arguments”, obtaining the answers 1/2, 1/3 and 1/4. Also, S. Ross’s “A First Course in Probability” has interesting examples involving infinite sets: given a (huge) bag and balls numbered 1, 2, …, do the following:

– 1 minute before 12, put balls 1–10 in the bag and take ball 1 out.

– 1/2 minute before 12, add balls 11–20 and take ball 2 out.

… and so on …

Q1) What happens at 12? The bag is empty.

Q2) What if the ball taken out is selected randomly each time? With probability 1 the bag will be empty at 12.

These are more curious (to me at least) than important, but they helped me keep some students awake.
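For the random-removal version (Q2), one can check numerically why every individual ball is doomed. Just before the n-th removal the bag holds 9n + 1 balls, so a fixed ball survives that removal with probability 9n/(9n + 1), and the product of these factors tends to 0. A hypothetical helper in Python (the step counts are arbitrary):

```python
def survival_probability(steps):
    """Probability that a fixed ball (say ball 1, inserted at step 1) is still
    in the bag after `steps` rounds, when each round removes a uniformly
    random ball. Just before the n-th removal the bag holds 9n + 1 balls."""
    p = 1.0
    for n in range(1, steps + 1):
        p *= 9 * n / (9 * n + 1)
    return p

# The product tends to 0 because the sum of 1/(9n + 1) diverges, so with
# probability 1 any given ball is eventually removed and the bag ends up empty:
probs = [survival_probability(10 ** k) for k in (1, 3, 5)]  # slowly decreasing
```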

February 4, 2008 at 3:19 pm |

Another version of the paradox is:

Suppose you and your friend play a game. Both of you check the amount of money in your wallets, and the one with more money gives all of it to the other. Now you reason as follows: suppose I have amount x in my wallet. If my friend has more than x, I gain more than x; if he has less than x, I lose x.

And assume that both players are equally likely to win (by symmetry). So your expectation is positive. But the same is true for him, whereas you both can’t have positive expectation, since you win exactly what he loses. So what is wrong with the argument?
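One way to see that the symmetry argument cannot give both players a positive expectation is to simulate the game with both wallets drawn independently from the same distribution. A Python sketch (the uniform distribution, the range and the seed are arbitrary illustrative choices):

```python
import random

def wallet_game(trials=100_000, seed=1):
    """Average gain for player A when both wallets are drawn independently
    from the same distribution and the richer player pays the poorer one."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        a = rng.uniform(0, 100)  # A's wallet (illustrative distribution)
        b = rng.uniform(0, 100)  # B's wallet
        if a > b:
            total -= a  # A is richer and hands over everything
        elif b > a:
            total += b  # B is richer and A collects
    return total / trials

gain = wallet_game()  # close to 0: neither player has positive expectation
```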

February 4, 2008 at 3:56 pm |

Even though I know this wasn’t supposed to be a discussion on the solution to the paradox, I can’t help it.

I do not know a lot about probability, but I think this paradox can be resolved very simply. It does not matter how (by which distribution) the sums in the envelopes have been determined. All we know is that we are given the information “there are two envelopes; one contains amount A and the other contains amount 2A”. This sounds quite similar to, but is very different from, the information “you may open one envelope; if you find amount A in it, you’ll be given the possibility to open a second one, which will contain amount A/2 or 2A, each with probability 1/2”. Let us call these situations (1) and (2). In situation (1) (the one we are really considering) you open an envelope. It contains amount X, and we don’t know whether X=A or X=2A; both are possible with equal probability 1/2. If we switch, with probability 1/2 we go from A to 2A, or from 2A to A. No gain or loss is to be expected, and obviously the same holds if we do not switch.

On the other hand, if we are in situation (2), where the amount in the next envelope actually depends on the amount in the first one (that’s the whole point: it does not in situation (1)!!), then naturally one has to go for the second one. If initially one has found amount x, then by iterating this procedure, our obtained amount will perform a random walk on x·2^Z (Z = the integers).

February 18, 2008 at 6:26 pm |

Regarding the paradox presented, I believe its solution is, at least in this case, quite straightforward: one envelope contains $x$ money, the other $2x$. We can initially choose each with probability 1/2, therefore our expected amount is $x+x/2=3x/2$. Afterwards, when we consider switching envelopes, the reasoning proceeds thus: I don’t know how much money is in my current envelope, so the other one can contain either $x$ or $2x$, each with probability 1/2. In the first case, I’m bound to lose an amount $x/2$ relative to my current expected value ($3x/2$); in the second case, I’ll win $x/2$. The expected gain, therefore, is $1/2*(-x/2)+1/2*(x/2)=0$, as intuition predicts.

Reworking the example to have amounts $x$ and $kx$ in the envelopes, with $k$ any constant other than 2, works as well.

Now, why does the analysis in the problem statement appear paradoxical? The analysis is certainly correct (refer to the calculation in the 3rd paragraph of the post); it’s just that it’s an analysis for a different problem. Consider:

Suppose I have 10 dollars. A friend offers me the following proposition: I give him my 10 dollars, and in return he will flip a coin. If it comes up heads, he’ll give me 5 dollars; if tails, he’ll give me 20 dollars (the $x/2$ and $2x$ envelopes, respectively). Now, should I accept his offer? Of course! As the calculation in the post shows, my expected return is (5/4)*10 dollars; minus my original 10 dollars, I come out winning 2.5 dollars on average.

In conclusion, the paradox stems from applying the wrong analysis to the problem. Since we mostly look for flaws in the analysis itself (that is, we neglect to check whether the analysis really addresses THIS problem), while the development of the analysis is indeed correct, we get the “What the hell?” feeling.

February 18, 2008 at 6:42 pm |

Well, apparently the LaTeX thingy doesn’t work as I thought. I never tried to use it before and couldn’t be bothered to read the instructions, so, sorry… In my post above, the $…$ things are supposed to be formulae in LaTeX syntax.

Another thing, this time regarding the paradox proposed in the reply above by mmm: it is assumed that, by ‘symmetry’, both participants have the same chance to win, hence the paradox. The assumption is, of course, wrong. You would only have true symmetry if you both had the same amount of money in your respective wallets at the start.

The relative amounts of money the players have are critical to deciding who wins, and it is implicit in the problem statement that, lacking knowledge of how much more (or less) money I have than the other person, I can assume that I occupy a central (‘symmetric’) position in the overall distribution of money contained in people’s wallets in general.

Now, that assumption is obviously at fault. If I have no money at all, I’m guaranteed never to lose. And if only a few people carry more than 100 dollars in their wallets but I happen to have 150 right now, I’m more than likely to lose.

So your expected gain depends on how much money you’re carrying in your wallet right now, and you are certainly aware of that amount. And even if you had no way of knowing the general distribution of money inside people’s wallets, you couldn’t claim an expected win based on ‘symmetry’; you’d have to say that your expected gain is indeterminate.

March 11, 2008 at 11:41 pm |

Not sure if it would be useful, but there is a nice recent book by Székely on paradoxes in probability theory:

Székely, G. J., Paradoxes in Probability Theory and Mathematical Statistics.

It is a large listing of paradoxes with references to the literature.

April 23, 2008 at 1:55 pm |

OK, so I know the point of this was not the paradox itself, but to hell with it.

The crux of the paradox, for me, lies in the fact that x itself is a random variable, and we don’t recognize it as such when supposedly taking expectations to compare outcomes: after taking an unconditional expectation, which is supposed to be a number, we end up with x, which is a random variable. This is not possible, and we have been fooled.

The way it is presented, we are considering the case where $x=2a$ and saying the other envelope has a 1/2 probability of containing $2x=4a$. This case does not exist, and we know so, hence it should not appear in our computation…

So we take an envelope and call its value $x$. The other envelope contains either $2x$ if $x=a$, or $x/2$ if $x=2a$.

The expectation of its value is $3a/2$; the expectation of the other envelope is also $3a/2$; there is no incentive to switch.



December 5, 2009 at 2:04 am |

Not sure if anyone posted it above, but I think the St Petersburg paradox is a great one to do…

February 21, 2010 at 3:36 am |

Here’s a lesser-known problem, superficially similar to the two-envelope paradox, but actually not. Two people each pick a number (whether randomly, haphazardly, or otherwise); each writes it down on a piece of paper, which is then placed face down. Person A, say (who knows only their own choice, not that of the other), has to guess whether the other number, which is hidden from them, is larger than theirs or not. The numbers, by the way, can be any two numbers: not necessarily integers, and possibly positive or negative. The question: is there a strategy whereby the player (A) can guess correctly with probability greater than 0.5? In other words, is there a strategy that makes this a game with positive expectation? Surprisingly, there is!
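One such strategy, usually attributed to Thomas Cover, is to draw a random threshold T from any distribution with full support on the reals and guess “larger” exactly when T exceeds your own number: whenever T lands strictly between the two numbers the guess is certainly right, and otherwise it is right with probability 1/2. A simulation sketch in Python (the normal threshold distribution and the particular numbers are illustrative choices):

```python
import random

def guess_other_is_larger(mine, rng):
    """Draw a threshold T from a standard normal distribution (full support)
    and guess that the hidden number is larger exactly when T > mine."""
    t = rng.gauss(0.0, 1.0)
    return t > mine

def win_probability(x, y, trials=200_000, seed=2):
    """Empirical success rate when the pair of numbers is (x, y) and the
    player is handed one of them uniformly at random."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        mine, hidden = (x, y) if rng.random() < 0.5 else (y, x)
        if guess_other_is_larger(mine, rng) == (hidden > mine):
            wins += 1
    return wins / trials

p = win_probability(1.0, 3.0)  # strictly above 1/2
```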

October 8, 2015 at 1:26 pm |

I hope this thread is still live. Referring to the argument that “your expectation if you switch is 5/4”, you write:

“Of course, the second argument is incorrect, but the reasons are somewhat subtle.”

That is wrong.

You can see this easily by listing the possibilities. The problem is not as simple as you seem to think, but it also has nothing to do with possible prior probability distributions — something that can be easily seen by simply choosing numbers.

There are actually two parts to the problem. Smullyan gives a clear statement of one part, which involves only actual amounts (not expected values) and their modes of designation (see, for example, “Satan, Cantor, and Infinity”, pp. 189–192). He does not resolve the problem, but stating it clearly is the most important step.

The second part of the problem does involve expectation values. This corresponds to your “second argument”. The last part of Smullyan’s account, on p 192 gives a compelling statement similar to yours, but again does not resolve the problem.

I don’t know whether Smullyan ever figured out the solutions but, in my view, his work on the problem is the most important contribution in print because he shows how the problem splits into two parts.

By the way, your system forces me to use the name of an old website of mine, instead of my real name. You use some extremely intrusive software, at the level of Google.

June 8, 2020 at 4:34 pm |

I recently found an analysis of the two-envelope problem that comes close to my thinking and which I think is essentially correct. The article is

“The Two-Envelope Paradox Resolved” by Timothy J. McGrew, David Shier, Harry S. Silverstein. It appeared in Analysis, Vol. 57, No. 1 (Jan., 1997), pp. 28-33.

You can find it on JSTOR or the Wiley Online Library if you have access, and it should be in most university libraries.