I’m glad to say that editorial work on the Princeton Companion is within a whisker of being completed (about three articles remain to be edited), so although I don’t quite feel that I have the leisure to give proper attention to this blog, which will be obvious from some of the messages that I haven’t got round to deleting, I can at least write a quick post. It starts with a conversation I had a couple of years ago. I was waiting for a plane to take me from Mykonos to Athens. The plane was severely delayed, but the situation could have been a lot worse as I had Persi Diaconis for company. He told me the not very surprising fact that it was not known what the probability of a win is in the game known as Patience in the UK and Solitaire in the US. (I’m talking about the one where you start by putting down a row of seven cards with just the first one face up, then on top of all but this first one a row of six cards with just the first one face up, and so on.) To be clear about the probability he is asking for, he simplifies the game by letting you see what all the cards are, so that you can play optimally and don’t have to worry about probabilities.
One thing he said that particularly struck me was, “If anyone can solve this I can guarantee them a front-page spread in the New York Times,” presumably an allusion to the attention he received for showing that you need seven riffle shuffles to shuffle a pack of cards properly. He also told me of someone he knew who had tried to estimate the probability by actually playing the game several times, but whose empirical evidence was flawed because he had a far from optimal strategy. Incidentally, I think Persi would be pretty impressed by any non-trivial rigorous bounds for the problem. (I’d count as trivial something like an upper bound that resulted from showing that you couldn’t win if a given arrangement of two or three cards occurred somewhere, but there may be more sophisticated “trivial” upper bounds too.)
That brings me to the main purpose of this post, which is to throw open a problem that I have idly wondered about for a long time. It concerns the game that I know as Beggar My Neighbour.
Well well. I thought I’d see if I could save myself some time by finding the rules for this game on the web. Sure enough, there they are on Wikipedia, but to my slight disappointment I see that what I thought was a nice problem I’d thought of myself (which in a sense it is) is in fact “a longstanding question in combinatorial game theory”. Let me nevertheless say what the question is and give a few thoughts about it. I’d also like to discuss a view that the article attributes to Conway. The article, by the way, is here if you don’t know this particular game (which also has various other names — another I had heard of is Strip Jack Naked).
Beggar My Neighbour is a game of pure luck: you deal out the cards and the outcome of the game is completely determined by the initial distribution of the cards. The question is simple: experience suggests that the game always finishes, but does it?
One can express the game as a deterministic walk through the space of all possible configurations of cards, where a “configuration” could be defined as a permutation of the 52 cards, together with an integer k between 1 and 52 that tells you that the first k cards belong to the player who is about to play (strictly speaking one needs a little extra bookkeeping as well, for the pile and for any penalty cards still owed). Then the question is whether such a walk can ever enter a cycle.
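To make that concrete, here is a minimal sketch of the general principle (the function name `walk_outcome` and the toy example are my own, not anything standard): a deterministic walk on a finite state space either reaches a terminal state or revisits a state, and the second possibility can be detected simply by remembering every state seen so far.

```python
def walk_outcome(start, step, is_terminal):
    """Follow a deterministic walk and report whether it terminates or cycles.

    `step` maps a state to its successor; states must be hashable.
    Returns ("terminates", n) or ("cycles", n), where n is the number
    of steps taken before the outcome is known.
    """
    seen = set()
    state, n = start, 0
    while not is_terminal(state):
        if state in seen:
            return ("cycles", n)   # a repeated state: the walk never ends
        seen.add(state)
        state = step(state)
        n += 1
    return ("terminates", n)

# Toy illustration: the walk n -> (3n + 1) mod 10 starting from 7
# never reaches 0, so it must cycle (7 -> 2 -> 7 -> ...).
print(walk_outcome(7, lambda n: (n * 3 + 1) % 10, lambda n: n == 0))
# → ('cycles', 2)
```

For Beggar My Neighbour the state space is enormous but finite, so in principle exactly this method settles any single deal; the question is about what happens over all deals.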
According to Wikipedia, Conway describes this as an “anti-Hilbert problem,” by which he means a problem that should definitely not set the direction for future mathematical research. While I of course see what he means, I can’t help finding the problem interesting (and I expect he can’t either). In particular, it has at least one good feature, which is that one can also give fairly well-motivated variants of it, as I’ll explain in a moment.
First, let me make one simple (and, I see from Wikipedia, well-known) observation. An obvious line of attack on the problem would be to generalize it and show that every Beggar-My-Neighbour-type game had to terminate. But this is false. For instance, consider the reduced game where my hand is (2,2,jack,2) and yours is (jack,2). Suppose also that I start. Then I’ll put down a 2, you’ll put down a jack, I’ll put down a 2, and you’ll pick up the pile. Now our hands are (jack,2) and (2,2,jack,2) and it’s your turn. Since this is exactly the position I was in at the start, the game goes on for ever.
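This example is easy to check by machine. The sketch below uses an encoding of my own (“J” for a jack, “-” for a non-royal card, a jack demanding one penalty card, and the cap `max_steps` an arbitrary safety limit); it steps through the reduced game and reports a repeat of the starting position after six cards have been played.

```python
from collections import deque

def play(hand0, hand1, max_steps=10_000):
    """Play the reduced game in which the only penalty card is the jack.
    "J" is a jack, "-" a non-royal card.  Returns (winner, steps), with
    winner None if a position repeats, i.e. the game goes on for ever."""
    hands, pile = [deque(hand0), deque(hand1)], []
    turn, owes = 0, 0                    # whose go it is; penalty cards owed
    seen = set()
    for step in range(max_steps):
        state = (tuple(hands[0]), tuple(hands[1]), tuple(pile), turn, owes)
        if state in seen:
            return None, step            # repeated position: no termination
        seen.add(state)
        if not hands[turn]:
            return 1 - turn, step        # player to move has run out of cards
        pile.append(hands[turn].popleft())
        if pile[-1] == "J":
            turn, owes = 1 - turn, 1     # opponent must now pay one card
        elif owes:
            owes = 0                     # penalty paid with a non-royal card,
            turn = 1 - turn              # so the jack's owner takes the pile
            hands[turn].extend(pile)
            pile.clear()
        else:
            turn = 1 - turn              # ordinary alternating play
    return None, max_steps

# My hand (2, 2, jack, 2) versus your hand (jack, 2), me to start:
print(play("--J-", "J-"))  # → (None, 6): the position repeats after 6 cards
```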
Thus, there isn’t some general principle that tells you that a certain quantity always goes down as you play this sort of game.
For me that’s quite a compelling reason to agree with Conway’s judgement that the problem itself is an anti-Hilbert one. However, what about trying to explain the phenomenon that the game does in practice always seem to terminate? To do this it would be sufficient to prove that it terminates with massively high probability, or at least with high enough probability that on the rare occasions when the initial configuration is part of a cycle the players just get bored and stop without actually noticing that their game wouldn’t terminate if they continued.
Even this statistical question seems to be very hard, but there is a further simplification that makes it neater without reducing its mathematical interest. That is to take a pack of cards that just contains jacks and non-jacks. The rule is that if you play a jack and your opponent follows it with a non-jack then you pick up. I tried this with my daughter: all black cards had the role of jack, and all red cards had the role of non-jacks. I was quite surprised to discover that this was a reasonable game … and it terminated.
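It is easy to run a rough Monte Carlo experiment for this simplified game. The engine below is my own minimal encoding (again “J” for a jack and “-” for a non-jack, a jack demanding one penalty card), the 26/26 split mimics the black-and-red deal with my daughter, and the seed and `max_steps` cap are arbitrary choices; I make no claim about what proportion will turn out to terminate.

```python
import random
from collections import deque

def play(hand0, hand1, max_steps=20_000):
    """Jacks-and-non-jacks game: "J" demands one penalty card, "-" is plain.
    Returns (winner, steps); winner is None if a position repeats."""
    hands, pile = [deque(hand0), deque(hand1)], []
    turn, owes, seen = 0, 0, set()
    for step in range(max_steps):
        state = (tuple(hands[0]), tuple(hands[1]), tuple(pile), turn, owes)
        if state in seen:
            return None, step            # repeated position: a cycle
        seen.add(state)
        if not hands[turn]:
            return 1 - turn, step        # player to move is out of cards
        pile.append(hands[turn].popleft())
        if pile[-1] == "J":
            turn, owes = 1 - turn, 1     # opponent must pay one card
        elif owes:
            owes, turn = 0, 1 - turn     # penalty paid: jack's owner
            hands[turn].extend(pile)     # collects the pile and leads
            pile.clear()
        else:
            turn = 1 - turn
    return None, max_steps

rng = random.Random(0)                   # fixed seed for reproducibility
deck = list("J" * 26 + "-" * 26)         # black cards = jacks, red = non-jacks
results = []
for _ in range(200):
    rng.shuffle(deck)
    results.append(play(deck[:26], deck[26:]))
finished = sum(winner is not None for winner, _ in results)
print(f"{finished} of 200 random deals terminated")
```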
If, as I suspect, even this game is very hard to analyse statistically, it is still quite tempting to try to devise a convincing non-rigorous model for it that would “explain” the observed phenomenon that the game appears to terminate with very high probability, and perhaps even predict the distribution of how long it takes. One of the nice features of the game is that the procedure of putting down the cards alternately has a shuffling effect. This is highly relevant to the game, because it seems that a player gets a big advantage from a longish run that is dense with royal cards, but these runs get naturally dispersed unless both players have runs at the same time. Perhaps there is a way of modelling this dispersal effect probabilistically.

Similarly, in the real game of Beggar-My-Neighbour, it makes a big difference who has the jacks, since it is quite hard to lose a jack. This is because after you play a jack, your opponent can win it only by playing a royal card and then going on to be the person who next picks up. In practice, this feels like a random event: you play a jack and then either breathe a sigh of relief if you don’t lose it or curse your luck if you do. And if, like most mortals, you’ve forgotten the order of the cards, then in a certain sense it is a random event, despite its completely deterministic nature.

Perhaps one could devise a realistic “randomized Beggar-My-Neighbour” that was easier to study. Even something rather simple might help, such as occasionally inserting random non-royal cards (which I myself know as “chicken-feed” for some reason) into hands, or removing them. Perhaps if this was done at the right rate it would make certain events into ones that could be analysed probabilistically, but it would not destroy too badly the important features of the game such as the gradual dispersal of significant runs of cards.
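The chicken-feed suggestion can at least be written down precisely as a perturbation operator, without committing to a full model. The sketch below is purely my own illustration: the `rate` parameter and the even insert-or-delete split are arbitrary, and the point is only that royal cards are never touched, so the structure the game cares about (who holds the jacks, and where) is disturbed only gradually.

```python
import random

def perturb(hand, rate, rng):
    """With probability `rate`, insert one non-royal card ("-") at a random
    position in `hand`, or delete a randomly chosen non-royal card from it.
    Royal cards (anything other than "-") are left alone.  Returns a new
    list; the input hand is not modified."""
    hand = list(hand)
    if rng.random() < rate:
        plains = [i for i, c in enumerate(hand) if c == "-"]
        if not plains or rng.random() < 0.5:
            hand.insert(rng.randrange(len(hand) + 1), "-")   # add chicken-feed
        else:
            del hand[rng.choice(plains)]                     # remove some
    return hand

rng = random.Random(1)
print(perturb(list("-J--Q-"), 1.0, rng))  # one non-royal card added or removed
```

Applied between rounds of the deterministic game, a perturbation like this is one candidate for the “right rate” of randomization mentioned above.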
Of course, it might well be possible to prove that after a randomization procedure the game terminated with probability 1 because just occasionally something incredibly unlikely happened that caused it to terminate for trivial reasons. So the question would have to be to show that with high probability it terminates after not too ridiculously long.
And then of course one could try randomizing the simpler version of the game with just jacks and non-jacks.
To summarize, I think it is an interesting challenge (not Hilbert-interesting but certainly interesting given the current level of interest in probabilistic processes with simply described dependencies) to devise some simplified model of Beggar-My-Neighbour and prove rigorously that it terminates reasonably quickly with high probability. Indeed, I find this a more interesting question than the question of whether the actual game always terminates.
All these are very natural reactions to the original question, so it is probable that others have already had them. If you know of anywhere these questions have been raised before, I’d be interested to hear about it.
Wikipedia has a reference to an American Mathematical Monthly article on the subject. I can’t get hold of that, but have found a discussion elsewhere (by Richard Guy) that gives a few salient facts that I think are contained in it: there are no cycles if you halve the standard pack (this was discovered by a computer search), but there are if you add or take away two non-royal cards; and an arrangement has been found of the standard pack that takes 5790 goes before it terminates.