Probability paradox II

A couple of days ago I had an email from Noga Alon, who told me about a much better version of the paradox I discussed in an earlier post. Some of the comments relating to that post also allude to this better version. The reason it is better is that one can no longer object to it on the grounds that it assumes the existence of a probability distribution with impossible properties.

Here is the version as Noga described it (except that if I say anything accidentally stupid, then that is my own personal contribution). You are presented with two envelopes, but now you are told that the amounts of money in the envelopes are 10^n dollars and 10^{n+1} dollars, where the positive integer n is chosen with probability 2^{-n}.

Suppose that you open one of the envelopes. If it contains 10 dollars, then trivially you should switch: the other contains 100 dollars. Now suppose that it contains 100 dollars. This could have happened in one of two ways: with probability 1/2 × 1/2 = 1/4 we have n=1 and you chose the envelope with more money, so the other envelope contains 10 dollars; and with probability 1/4 × 1/2 = 1/8 we have n=2 and you chose the envelope with less money, so the other envelope contains 1000 dollars. So the conditional probabilities are 2/3 and 1/3. But the amount you gain by switching is so great in the 1/3 case that your expected gain if you switch is certainly positive.
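
To spell out the arithmetic: given that you see 100 dollars, the expected gain from switching is

    \frac{2}{3}(10-100)+\frac{1}{3}(1000-100) = -60+300 = 240 > 0.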

It is not hard to see that this argument works for any amount of money you find in the envelope, as long as it is not 10 dollars. But in that case you still switch—it’s just that the reasons are different. So whatever amount you discover in the envelope when you open it, you improve your expected gain if you switch. So it should surely follow that you don’t need to look in the envelope before deciding to switch. And that is the paradox.
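
For anyone who wants to check this, here is a minimal sketch in Python (purely illustrative) that computes the conditional expected gain from switching when the opened envelope contains 10^k dollars; it is positive for every k.

    from fractions import Fraction

    def expected_gain_if_switch(k):
        """Conditional expected gain from switching, given that the opened
        envelope contains 10**k dollars (k >= 1)."""
        if k == 1:
            # Seeing 10 dollars forces n = 1, so the other envelope holds 100.
            return Fraction(10**2 - 10)
        # Two ways to see 10**k: n = k-1 and you picked the larger envelope
        # (prior weight 2**-(k-1) * 1/2), or n = k and you picked the smaller
        # one (prior weight 2**-k * 1/2).  Normalising gives 2/3 and 1/3.
        return Fraction(2, 3) * (10**(k-1) - 10**k) + Fraction(1, 3) * (10**(k+1) - 10**k)

    for k in range(1, 8):
        gain = expected_gain_if_switch(k)
        print(f"see 10^{k}: expected gain from switching = {gain}")
        assert gain > 0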

You might object by saying that the expected amount in the envelopes is infinite, so in practice you would always know that the situation was not truly as I have just described it. But that objection will not do: the situation is at least logically possible, and one can easily modify the scenario. For example, suppose that just after your death you find that you are still conscious, but that none of the world’s major religions have got it quite right about life after death. Instead, you are greeted by the great god \mu\alpha\alpha\theta (at which point you finally understand why it was that you had been so mysteriously obsessed by mathematics). \mu\alpha\alpha\theta tells you that you will live a life of eternal bliss if and only if you manage to solve all the Clay millennium problems within a time that is specified in one of two envelopes. Moreover, the amount of time (in centuries) is chosen randomly according to the distribution described above for the dollars. And you are warned that one of the problems is pretty hard, even for someone with mathematical training and ten thousand years to do it, so you are advised to try to maximize your expected time. Then, once again, one argument says you should switch envelopes and another says it makes no difference.

41 Responses to “Probability paradox II”

  1. Omar Antolín Says:

    I like Raymond Smullyan’s version of the two envelope paradox even better. In his version there is no probability at all; it’s something more basic.

    Again you’re presented with two envelopes, one containing twice as much money as the other; after picking one you’re given the chance to switch. Consider the following statement:

    “The amount of money you’d gain by switching, if it turns out you’d gain, is greater than the amount you’d lose if it turns out you’d lose”

    This is true: if your envelope has x dollars the other has either 2x or x/2, so if you gain, you gain x; if you lose, you lose x/2.

    But the statement is also false: say one of the envelopes has t dollars and the other 2t dollars. If the envelope you chose initially is the one with t, switching gains t dollars; if it is the one with 2t, switching loses you t dollars. So the possible gain and possible loss are equal.

    What’s wrong here?

  2. Charles Tye Says:

    I still think the resolution to this paradox is that the number of actual dollars is not infinite and therefore the assumption about the initial probability distribution is invalid. Imagine you are playing the game with Bill Gates. Gates only has a finite number of dollars, say, 2^35 dollars, and he cannot fill the envelopes according to the distribution you specify. If I open Bill’s envelope and find 2^35 dollars inside, I would definitely not switch, although for all practical purposes it would make no difference to me if I only got 2^34 dollars.

    I think the same goes for your after-death experience. I mean, you cannot guarantee even to be able to read the number in the deity’s envelope within any time t (excluding tricks like supertasks).

    As a reply to your last post about probability paradoxes, I have always liked Bertrand’s Paradox (the probability that a random chord of a circle is longer than the side of the inscribed equilateral triangle).

  3. Terence Tao Says:

    Dear Tim,

    A slight nitpick with the afterlife example: I would assume that your objective would not necessarily be to maximise the expected amount {\Bbb E}(T) of time T you get to spend on the Clay problem, but rather to maximise the probability of achieving eternal bliss, which can be computed as {\Bbb E} f(T), where T is the amount of time the envelope provides and f(T) is the probability that you can solve the problem in time T. In practice, f: [0,+\infty) \to [0,1] would be an increasing function (that, I suppose, would eventually tend to 1 just by the infinite monkey theorem, assuming that the problem is in fact decidable), but since f(T) has finite expectation, it cannot be the case that switching always offers you a better probability of success (otherwise you could iterate and obtain a contradiction). Indeed, for T large enough, the risk of getting T/10 time instead would presumably cost you more in probability of success than the benefit one would get from extending the time to 10T.
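
    To illustrate this with a toy success probability (the particular function f(T) = 1 - exp(-T/c) and the constant c below are assumptions made purely for this sketch, not part of the argument):

        import math

        def f(T, c=1e6):
            """A hypothetical increasing success probability P(solve | time T)."""
            return 1.0 - math.exp(-T / c)

        def change_in_success_prob(k):
            """E[f(T_other) - f(T) | T = 10**k centuries], using the conditional
            probabilities 2/3 (other envelope smaller) and 1/3 (other larger)."""
            T = 10.0 ** k
            if k == 1:
                return f(10 * T) - f(T)   # seeing 10 forces the other envelope to hold 100
            return (2/3) * (f(T/10) - f(T)) + (1/3) * (f(10*T) - f(T))

        for k in range(1, 10):
            print(k, change_in_success_prob(k))
        # Positive for small k, negative once 10**k is comparable to c:
        # past that point the 2/3 risk of dropping to T/10 outweighs
        # the 1/3 chance of rising to 10T.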

    Another amusing observation on the two-envelopes problem: even if you don’t know the a priori probability distribution of x (other than that the expectation is finite), one can still design a strategy that guarantees a better expectation than always switching or always staying. Namely, one generates another random variable y \in [0,+\infty) by one’s favourite distribution, and switches when x < y and holds when x \geq y. It is not hard to see that this is always a superior strategy, although the bound one obtains on how much it is superior is ineffective if one does not know the distribution of x.
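
    Here is a rough simulation of that strategy (a sketch only: the Pareto distribution for the amounts, the exponential threshold and the sample size are arbitrary choices, with the amounts given finite expectation as required):

        import random

        random.seed(0)

        def one_round(switch_rule):
            """Play one round of the two-envelope game with amounts (a, 2a),
            where a has finite expectation, and return the final amount."""
            a = random.paretovariate(3.0)      # finite-mean distribution for the smaller amount
            envelopes = [a, 2 * a]
            i = random.randrange(2)
            x, other = envelopes[i], envelopes[1 - i]
            return other if switch_rule(x) else x

        def always_stay(x):
            return False

        def random_threshold(x, rate=1.0):
            # Switch exactly when the observed amount is below an independent
            # exponential threshold, i.e. with probability exp(-rate * x),
            # which is strictly decreasing in x.
            return x < random.expovariate(rate)

        N = 10**6
        for name, rule in [("always stay", always_stay), ("random threshold", random_threshold)]:
            avg = sum(one_round(rule) for _ in range(N)) / N
            print(f"{name:>16}: average final amount ≈ {avg:.4f}")
        # The random-threshold strategy comes out ahead of always staying, and
        # since always switching has the same expectation as always staying,
        # it beats that too.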

  4. nicolaennio Says:

    For me the real paradox is getting a natural n with probability 2^{-n}

  5. Mark Boyd Says:

    It’s only a paradox if we expect probability theory to give minimally sane advice.
    The expected payoff for both the “switch” and “don’t switch” strategy is infinite, so we need not (and in fact don’t) have absolute convergence when trying to calculate the expected improvement from switching.
    In some sense, we shouldn’t be surprised when two ways of summing this thing give two different answers. We know about conditional convergence, right? But I still find myself surprised.

    Opening the envelope may not be a good idea. Consider this: I give you an envelope, which contains 10^n dollars with probability 2^{-n}. Just as you are about to open it, you notice that E(X) is quite a lot larger than X. E(X) is what it’s worth to you unopened. X is what it’s worth opened. So you should never open it.
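
    For the record, the divergence is immediate from the setup: writing X for the amount in a randomly chosen envelope,

        {\Bbb E}(X) = \sum_{n=1}^{\infty} 2^{-n}\cdot\frac{10^n+10^{n+1}}{2} = \frac{11}{2}\sum_{n=1}^{\infty} 5^n = \infty.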

  6. jhusdhui Says:

    Just adding my own trivialized version of this paradox, similar to what Mark has written above: suppose you have two envelopes, both of which have an amount of money drawn independently from the above distribution. Here, opening one of the envelopes doesn’t even give information about the other. Still, you’d always want to switch.

  7. Scott McKuen Says:

    nicolaennio,

    > For me the real paradox is getting a natural n with probability
    > 2^{-n}

    Flip a coin until you get tails. Count the number of flips. Shouldn’t this work, or am I just confused?
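
    In code form (a trivial sketch):

        import random
        from collections import Counter

        def sample_n():
            """Flip a fair coin until the first tail; return the total number of flips.
            P(n) = 2**-n for n = 1, 2, 3, ..."""
            n = 1
            while random.random() < 0.5:   # heads with probability 1/2
                n += 1
            return n

        # Quick check of the first few probabilities.
        counts = Counter(sample_n() for _ in range(10**5))
        print([round(counts[k] / 10**5, 3) for k in range(1, 6)])  # ≈ [0.5, 0.25, 0.125, ...]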

  8. gowers Says:

    I very much enjoyed Smullyan’s version of the paradox (see Omar’s comment above) and tried it out on my lecture audience today. A few people came up afterwards with the following suggested solution. The argument that you stand to gain more than you stand to lose says that if x is the amount in the envelope you choose, then the amount you’d gain if you gain (namely x) is greater than the amount you’d lose if you lose (namely x/2). But that comparison is not legitimate, because the x would be different in the two situations. (In the second situation it would be twice as big.)

    I think there is something correct about that solution, but that the paradox is not quite that easily resolved. For instance, suppose you actually open the envelope and find 20 dollars. Then the amount you’d gain by switching, if you gained, would be 20 dollars and the amount you’d lose, if you lost, would be 10 dollars. And that argument is clearly valid whatever amount you have in the envelope. So what difference can opening the envelope make to the validity of the argument? Why can’t you say to yourself, before opening the envelope: “I’ve got some money in here. I don’t know how much it is, but it is a definite amount. If I switch and gain, I’ll gain the amount of money that I’ve got here in this envelope. If I lose, I’ll lose half the amount of money that I’ve got here in this envelope. So the amount I’ll gain if I gain is twice the amount I’ll lose if I lose.”?

    I agree with Terry Tao’s nitpick, so let me revert to the original idea I had about this: that your fate is to spend a number of centuries of bliss, given by the number you find in the envelope, followed by eternal annihilation. Even there it’s not absolutely clear that maximizing expectation is what you’d want to do: if you got 1000 years would you accept a two thirds chance of that dropping to 100 and a one third chance of it increasing to 10000 years? But that problem can be dealt with by changing the probabilities so that the resulting conditional probabilities are almost equal to a half.

    • Randall Says:

      What if there were a slot machine available whereby you could play this game (multiply your bliss time by 10 with prob 1/3 and by 1/10 with prob 2/3) as many times as you wanted? By this reasoning, you’d never stop playing and, almost surely, your bliss time would converge to zero. So while there is no limit to the expected bliss time you might achieve by some prudent choice of a stopping time, you’d better have some stopping time in mind so as to avoid ruin.
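
      A quick simulation of this (only a sketch; the starting time, number of plays and number of runs below are arbitrary choices): the expected multiplier per play is (1/3)*10 + (2/3)*(1/10) = 3.4, so the expectation explodes, but the expected change in the logarithm is (1/3 - 2/3) log 10 < 0 per play, which is why the typical run is ruined.

          import random, statistics

          random.seed(1)

          def play(start=1000.0, rounds=10):
              """Multiply the bliss time by 10 with probability 1/3, by 1/10 with probability 2/3."""
              t = start
              for _ in range(rounds):
                  t *= 10.0 if random.random() < 1/3 else 0.1
              return t

          runs = [play() for _ in range(100_000)]
          print("sample mean  :", statistics.fmean(runs))    # far above 1000, pulled up by a few lucky runs
          print("sample median:", statistics.median(runs))   # far below 1000: the typical run is ruined
          print("fraction of runs worse off:", sum(t < 1000 for t in runs) / len(runs))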

  9. John smith Says:

    but dr gowers, is there ‘really’ a paradox or is it just the impression of one?

    I find it surprising that a pure mathematician would even entertain these sorts of notions given that pure maths is not about the real world?

  10. jd2718 Says:

    I think your previous comment is almost there.

    Here’s my simple version: There is a finite amount of money in play: call it 3x. Your envelope contains either x or 2x. Much more boring this way, isn’t it?

  11. This past two weeks in the arXivs… « It’s Equal, but It’s Different… Says:

    [...] Probability paradox II [...]

  12. yanzhang Says:

    Professor Gowers:

    This is beautiful, by the way. =)

    I think the infinite expectation has everything to do with the “resolution” of the paradox. You are basically arguing (say the first envelope you open is called a and the second envelope you open is called b) that E(b) > E(a). Then you divide E(a) into a series of sums E(x_i|a)P(x_i) and show that sum is less than the E(b). However, this argument doesn’t work with an infinite sum. I mean, just because I write out

    1>0
    2>1
    3>2
    4>3 . . .

    does not mean you can argue that 1+2+3+4…. > 1+2+3+4… , which makes no sense since infinite sums only make sense when they converge.

    I think this is the heart of this paradox?

    -Yan

  13. joeH Says:

    Isn’t this just another variation of the knowledge vs. common knowledge distinction? See John Geanakoplos, The Journal of Economic Perspectives, Vol. 6, No. 4 (Autumn 1992), pp. 53–82.

  14. jian Says:

    If I am not mistaken, the mathematical question is: if done N times with large N, will switching make a difference to the total sum? Obviously not.

    Now, when one chooses just once, it is only a personality problem. Mathematical probability means large N, never N=1.

    Regarding Raymond Smullyan’s statement (see Omar’s comment, the very first one above). The second argument (t and 2t) is invalid. The person is solely choosing between t and 2t, he doesn’t “have” any sum yet, therefore there is no gain or loss.

  15. jian Says:

    Continuing last post:

    To make it more visual, consider (t, t^2) (both rounded to dollars). For large N, the total sum is still unaffected by switching (or not switching). But for N = 1, at least when the money in the envelope is less than some number, say, 1,000 dollars, every sensible person will switch the envelope, since switching means a 50% chance of winning a million dollars! But again, at N = 1, it is not a mathematical question.

  16. Guy Kindler Says:

    It seems that the crux of the paradox, put in a formal mathematical form, is the following: there exist two real valued random variables X and Y, such that E[X|Y]>Y and E[Y|X]>X. The existence of such random variables, which are actually also identically distributed, contradicts our intuition.

    Say that a r.v. X betters Y if E[X|Y]>Y, and that X worsens Y if E[X|Y]<Y. I think it may be an interesting (open?) question to understand which bettering/worsening relations are realizable within a set of random variables.
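
    One can check the first statement empirically for the envelope pair (a quick Monte Carlo sketch; the sample size and the cut-off of 1000 observations are arbitrary):

        import random
        from collections import defaultdict

        random.seed(3)

        # With probability 2**-n the amounts are (10**n, 10**(n+1)), and
        # (X, Y) is a uniformly random ordering of that pair.  Group by the
        # observed value of Y and compare the average X against it.
        def sample_pair():
            n = 1
            while random.random() < 0.5:
                n += 1
            pair = [10 ** n, 10 ** (n + 1)]
            random.shuffle(pair)
            return pair

        sums, counts = defaultdict(float), defaultdict(int)
        for _ in range(10**6):
            x, y = sample_pair()
            sums[y] += x
            counts[y] += 1

        for y in sorted(counts):
            if counts[y] >= 1000:                 # only values of Y seen often enough
                print(f"Y = {y}:  E[X | Y] ≈ {sums[y] / counts[y]:.1f}  (> Y)")
        # By the symmetry of the construction, the same holds with X and Y exchanged.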

  17. Anonymous Says:

    Just take the heavier envelope.

  18. Peter Shor Says:

    One of my favorite probability paradoxes comes from Lewis Carroll, and is based on a quite similar fallacy. He calculated the probability that three random points in a plane form a triangle with all acute angles. This is a little bit hard to describe without a picture, but let’s try. Assume that XY is the longest side. Consider where the third point Z lies. It must clearly lie in the intersection of two circles centered on X and on Y, which is the set of points for which XY is the longest side. Now, if you draw the circle with diameter XY, you have the set of points for which XYZ has an obtuse angle. Take the ratio of the areas, and you obtain his answer.

    Frank Wattenberg, 80 years later, realized you could do an analogous calculation if you assume that XY is the second largest side, but you obtain a different probability.

    There is quite a fascinating literature on this, much of which can be found by googling.

  19. nicolas Says:

    Anonymous, you’re hired.

  20. Jack Says:

    In fact, in the long run, it doesn’t matter if you switch envelopes or stick with your first choice. If the 2 envelopes have X and 2X in them, a person who always sticks will end up with an average of 1.5X and so will a person who always switches.

  21. Ben Says:

    I find it interesting that no one is talking about the initial probabilities of n. If you open an envelope with $1000 in it, it either means n=3 and you picked the small envelope, or n=2 and you picked the big envelope. But the problem states that n=2 is twice as likely as n=3. I’m not saying this has any relevance to the decision to switch. In fact, I don’t think it does. However, the reason why it’s not relevant is important.

  22. Gil Kalai Says:

    There is some resemblance between this paradox and Newcomb’s paradox. There, too, there are reasons for the dominant strategy to be rejected, but for Newcomb’s paradox this is not because of an unrealistic probability distribution but because of strange logical connections between your action and a prediction that took place before the action.

    As for this paradox, I do not see how the objection that the situation is unrealistic, because the probability distribution is supposed to have infinite expectation, can be rejected by the counter-argument with the heaven story (either Tim’s or Terry’s version).

  23. Gil Says:

    Another variant on how to tell this paradox would be that after you get the envelopes you have to pay 1 dollar for switching the envelopes and 1 dollar to look at the content of the envelope you have.

  24. Ehud Friedgut Says:

    If we formalize everything we have no problem with the fact that there are two random variables X and Y and two ways to partition our probability space such that in one way the conditional expectation
    of X is always larger, and in the other way Y always wins.

    What bothers us are the implications of this for our actions in “real life”.
    There are two operations we want to do:
    1) Choose between the envelopes in a manner that maximizes our expectation.
    2) Look at the content of the envelope we’re holding, in order to decide what to do.

    Obviously these two operations do not “commute” since before looking at the content the expectation is infinite, so action number 1 is meaningless.
    The whole “paradox” comes from our intuition that if, whenever we do action 2 first, we would do the same thing in action 1, then the operations do commute, i.e. it is meaningful to do action 1 first.

    This is just a fallacy of our limited intuition that usually
    deals with random variables with finite expectation.
    I like to think of this as a version of Schroedinger’s cat:
    by looking at the content of the envelope we cause a collapse of the wave function.

  25. Anonymous Says:

    The Ramanujan sum of 1 + 2 + 3 + 4 + · · · is −1⁄12

    see http://en.wikipedia.org/wiki/1_%2B_2_%2B_3_%2B_4_%2B_%C2%B7_%C2%B7_%C2%B7

  26. Mark Bennet Says:

    I have always found this problem fascinating. I think, though, that it depends on two assumptions which are not always analysed properly – first that any amount of money is divisible into two, and second that any amount of money can be doubled. Both of these assumptions fuzz the accurate identification of the correct sample space – the paradox lies in the inadequate definition of the problem.

    As soon as you realise that money comes in multiples of a minimum unit, any odd number of units must be the lower of the two amounts. If you assume that there is a maximum amount of money, then anything over half of this must be the higher value and can only be halved.

    In a practicable sample space there are low [odd] numbers which can only be doubled, and high [even] numbers which can only be halved – and if you do the calculations for the expected gain on exchange of envelopes correctly you find that it comes to zero.

    [brackets because eg the space {1,2,4,8,16,32,64} has just one low odd number and one high even number, but the calculations still work if you do them - the halving of high numbers balances the doubling of low ones]
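
    Here is a small sanity check of that claim (a sketch which assumes, as seems intended, that the pair of amounts is (x, 2x) with x uniform over the low numbers, and that you open either envelope of the pair with probability 1/2):

        from fractions import Fraction
        from collections import defaultdict

        low_numbers = [1, 2, 4, 8, 16, 32]
        pairs = [(x, 2 * x) for x in low_numbers]

        gain_by_observation = defaultdict(Fraction)   # weighted expected gain from switching
        for small, big in pairs:
            p_pair = Fraction(1, len(pairs))
            gain_by_observation[small] += p_pair * Fraction(1, 2) * (big - small)   # you opened the smaller
            gain_by_observation[big]   += p_pair * Fraction(1, 2) * (small - big)   # you opened the larger

        for value in sorted(gain_by_observation):
            print(f"observe {value:>2}: contribution to expected gain = {gain_by_observation[value]}")
        # Every observation except the largest contributes positively, the largest
        # contributes negatively, and the total is exactly zero:
        print("total expected gain from always switching =", sum(gain_by_observation.values()))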

  27. Steven Says:

    The answer to what to do is subjective, depending on how much you appreciate $2^n:

    for me, I need about $2 million to be sure I won’t have to work anymore and retire without financial worries (this is an age-dependent thing). So, $2^20 makes me very happy indeed, whereas $2^19 would definitely make me less happy, but $2^21 is a bit overkill, and does not make me much happier than the $2^20 already does. That’s where I put my personal threshold, meaning that if I open the envelope and find $2^n, with n=20, I won’t switch.

    There is only a paradox if you try, for some reason, to maximize the expected amount of money rather than the expected amount of happiness.

  28. Steven Says:

    Hey, some part of my post got deleted: I meant I switch when n20. (I had used “< =” [without the space] signs, which apparently don’t work)

  29. Steven Says:

    Shoot: even a smaller-than sign doesn’t work.

    Ok, I switch when n is below 21, but not when n is larger than 20.

  30. Mark Bennet Says:

    There is a related Henry Dudeney (I think this attribution is right, it might be Martin Gardner) puzzle on ties.

    Two friends enter a discussion about the respective merits of their ties. Each asserts that his/her tie is more expensive than the other’s. They agree that they should compare the cost of their ties, and the person who has the more expensive tie should give it to the other.

    Each reasons as follows.

    If my tie is cheaper, I gain a more expensive tie.
    If mine is the more expensive, I lose the tie I have.
    What I gain is more than what I lose.

  31. Florian Says:

    I think the proper explanation is the following:
    There are two answers, depending on the question:
    Why doesn’t it break probability theory? Because the expected amount of money in either envelope before opening one of them is infinite.
    Why doesn’t it work in practice? Because you will run out of dollars if n is too large.

    The paradox exists because you can’t decide whether you want to consider theory or practice. What you call “logical” is actually a mix of these two points of view.

  32. M-2: Two Probability Paradoxes « Concrete Nonsense Says:

    [...] original post [...]

  33. Eugene Says:

    it seems there is no envelope paradox:
    1. let’s assume the distribution of outcomes in the envelope covers the full axis, so that you can be in debt if you open the envelope – then the expectations are equal.
    2. now we assume that the sums in the envelope are positive – but with that restriction you automatically have a predetermined probability that should be not higher than one third.

    i.e., the envelope paradox is only a statement that the probability of getting 2x couldn’t be higher than 1/3 on an unlimited positive distribution.

  34. Bernard Kirzner, M.D. Says:

    You people have been in an ivory tower way too long.

    The calculated bet based upon the difference between the high vs. low is a matter of multiplication and absolute number. If the multiplier is 2, of course you want the 2x - x, not the x - x/2. So?

    But mathematics will do nothing to point you in one direction or the other.

    The mathematics are only telling us how much of a difference there will be between the two outcomes, NOT WHICH DIRECTION TO GO.

    The mathematicians in this discussion have mixed up the absolute amount of difference with the identification of which side of the multiplication you start at.

    The solution is being discussed here as if it is simple. It’s just like the stock market: buy when low and sell when high.

    In this it’s just a matter of switching when you have the smaller amount and staying when you have the larger amount.

    There is no possible way in the real world where staying or switching can possibly be anything but equal bets. There is no mechanism by which it could be anything else but random, unless other factors, such as the suggested size of the money are factored in.

    The mathematics are just a distraction from the tougher issue, choosing which envelope to take. There is no added knowledge about location from the knowledge of absolute vs. relative amounts of money in the two envelopes.

    Pardon the reference, but I smell a faith healer making up excuses for why the trick didn’t work.

    (Monty Hall problems have the same type of ivory tower problems, but that’s another discussion.)

  35. Tony Says:

    Hmm. Seems to me the problem as you’ve formulated it also has a close relation to the St Petersburg paradox: you toss a fair coin repeatedly until it comes up tails, and your payoff is 2^n where n is the number of heads. Since the expected value of the payoff is infinite, you can set any arbitrary price to play this game and still have a long-run expectation of coming out ahead over many games.
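
    A tiny simulation of a single such game (a sketch; the numbers of plays are arbitrary) shows the characteristic behaviour of the sample mean:

        import random, statistics

        random.seed(2)

        def st_petersburg():
            """Toss a fair coin until tails; pay 2**n where n is the number of heads."""
            n = 0
            while random.random() < 0.5:
                n += 1
            return 2 ** n

        for plays in (10**2, 10**4, 10**6):
            payoffs = [st_petersburg() for _ in range(plays)]
            print(plays, "plays: sample mean payoff ≈", statistics.fmean(payoffs))
        # The sample mean keeps drifting upwards as the number of plays grows,
        # reflecting the infinite expectation.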

    In this case you have two envelopes each of which provides a St Petersburg style payoff. Of course once you know the contents of one envelope, it limits the possibilities, but your ultimate payoff is still of the St Petersburg kind.

    The real killer question is: should one be prepared to gamble the winnings of a St Petersburg game (offered for free but not yet played) against the outcome of a second St Petersburg game?

  36. Tony Says:

    To follow up my own post: David Chalmers has considered the question of deciding between the winnings of two St Petersburg games here.

    He also highlights the relevance of this problem to the original two-envelopes paradox.

  37. anon2 Says:

    Smullyan’s version of the paradox (see Omar’s comment above) exists because it involves an unbounded set, with the expectation value of an envelope being infinite.

    A very simple example to illustrate this is to consider a bounded set (1,2,4,8) and then work out the net value of switching. If you get 2, the net value of switching is indeed (4-2)-(2-1) = 1. However, the value of switching when you get 8 is -4, which balances the net value of switching for all the other options. It’s easy enough to see that this holds for any bounded set {2^i}: i running from 1 to a finite natural number.

    The other questions raised here also have similar resolutions, imho.

  38. anon2 Says:

    Btw, just for the sake of completeness, I want to mention that we can show the above relation for ANY ratio r, not just 2. Basically, the benefit of switching is positive for EVERY number except the greatest number in the set. If the set is infinite, once you know what the number in the envelope is (finite), you should switch! Just that you cannot make the claim that irrespective of what the number is, you should switch, because the loss from switching after obtaining the largest number in the set (tending to infinity) exactly cancels out the net benefit from switching after obtaining EVERY other number! So to repeat, if you know the number you obtained is not the largest number in the set, you should switch.

    Great problem! Was fun to solve!
