Open problems concerning card games

I’m glad to say that editorial work on the Princeton Companion is within a whisker of being completed (about three articles remain to be edited), so although I don’t quite feel that I have the leisure to give proper attention to this blog, which will be obvious from some of the messages that I haven’t got round to deleting, I can at least write a quick post. It starts with a conversation I had a couple of years ago. I was waiting for a plane to take me from Mykonos to Athens. The plane was severely delayed, but the situation could have been a lot worse as I had Persi Diaconis for company. He told me the not very surprising fact that it was not known what the probability of a win is in the game known as Patience in the UK and Solitaire in the US. (I’m talking about the one where you start by putting down a row of seven cards with just the first one face up, then on top of all but this first one a row of six cards with just the first one face up, and so on.) To be clear about the probability he is asking for, he simplifies the game by letting you see what all the cards are, so that you can play optimally and don’t have to worry about probabilities.

One thing he said that particularly struck me was, “If anyone can solve this I can guarantee them a front-page spread in the New York Times,” presumably an allusion to the attention he received for showing that you need seven riffle shuffles to shuffle a pack of cards properly. He also told me of someone he knew who had tried to estimate the probability by actually playing the game several times, but whose empirical evidence was flawed because he had a far from optimal strategy. Incidentally, I think Persi would be pretty impressed by any non-trivial rigorous bounds for the problem. (I’d count as trivial something like an upper bound that resulted from showing that you couldn’t win if a given arrangement of two or three cards occurred somewhere, but there may be more sophisticated “trivial” upper bounds too.)

That brings me to the main purpose of this post, which is to throw open a problem that I have idly wondered about for a long time. It concerns the game that I know as Beggar My Neighbour.

Well well. I thought I’d see if I could save myself some time by finding the rules for this game on the web. Sure enough, there they are on Wikipedia, but to my slight disappointment I see that what I thought was a nice problem I’d thought of myself (which in a sense it is) is in fact “a longstanding question in combinatorial game theory”. Let me nevertheless say what the question is and give a few thoughts about it. I’d also like to discuss a view that the article attributes to Conway. The article, by the way, is here if you don’t know this particular game (which also has various other names — another I had heard of is Strip Jack Naked).

Beggar My Neighbour is a game of pure luck: you deal out the cards and the outcome of the game is completely determined by the initial distribution of the cards. The question is simple: experience suggests that the game always finishes, but does it?

One can express the rules of the game as a fairly simple deterministic walk through the space of all possible configurations of cards, where a “configuration” could be defined as a permutation of the 52 cards, together with an integer k between 1 and 52 that tells you that the first k cards belong to the player who is about to play. Then the question is whether this walk has any cycles.
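
To make the walk concrete, here is a minimal sketch of it in Python. It is an illustration rather than anyone's reference implementation: the card encoding, the assumption that a captured pile goes to the bottom of the winner's hand in the order it was played, and the crude move cap standing in for genuine cycle detection are all choices made purely for the sake of the example.

```python
from collections import deque

# Penalty cards and the number of cards the opponent must pay for each.
PENALTY = {'J': 1, 'Q': 2, 'K': 3, 'A': 4}

def play(hand1, hand2, max_moves=100_000):
    """Play Beggar-My-Neighbour deterministically from two starting hands.

    Cards are single characters; anything not in PENALTY counts as
    chicken-feed.  Returns the number of cards played before one hand
    empties, or None if max_moves is exceeded (a crude stand-in for
    detecting a cycle).
    """
    hands = [deque(hand1), deque(hand2)]
    pile = []           # cards played so far in the current round
    turn = 0            # index of the player about to play a card
    to_pay = 0          # penalty cards the current player still owes
    moves = 0

    while hands[0] and hands[1]:
        if moves >= max_moves:
            return None                      # give up: possibly a cycle
        card = hands[turn].popleft()
        pile.append(card)
        moves += 1
        if card in PENALTY:
            # Any penalty card passes the obligation to the opponent.
            to_pay = PENALTY[card]
            turn = 1 - turn
        elif to_pay > 0:
            to_pay -= 1
            if to_pay == 0:
                # Penalty paid off with chicken-feed only: the other
                # player takes the whole pile and leads the next card.
                turn = 1 - turn
                hands[turn].extend(pile)
                pile = []
        else:
            turn = 1 - turn                  # ordinary alternation
    return moves
```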

According to Wikipedia, Conway describes this as an “anti-Hilbert problem,” by which he means a problem that should definitely not set the direction for future mathematical research. While I of course see what he means, I can’t help finding the problem interesting (and I expect he can’t either). In particular, it has at least one good feature, which is that one can also give fairly well-motivated variants of it, as I’ll explain in a moment.

First, let me make one simple (and, I see from Wikipedia, well-known) observation. An obvious line of attack on the problem would be to generalize it and show that every Beggar-My-Neighbour type game had to terminate. But this is false. For instance, consider the reduced game where my hand is (2,2,jack,2) and yours is (jack,2). Suppose also that I start. Then I’ll put down a 2, you’ll put down a jack, I’ll put down a 2, and you’ll pick up. Now our hands are (jack,2) and (2,2,jack,2) and it’s your turn. Since this is exactly the position I was in at the start, the game goes on for ever.
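
For what it is worth, the sketch above (in which the jack is the only penalty card, costing one card, and the 2s count as chicken-feed) reproduces this cycle mechanically: the hands swap after three cards, the game never ends, and the move cap is eventually hit.

```python
# The reduced position from the example above: the hands swap after three
# cards are played, the game cycles forever, and the move cap is reached.
print(play(['2', '2', 'J', '2'], ['J', '2']))   # prints None
```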

Thus, there isn’t some general principle that tells you that a certain quantity always goes down as you play this sort of game.

For me that’s quite a compelling reason to agree with Conway’s judgement that the problem itself is an anti-Hilbert one. However, what about trying to explain the phenomenon that the game does in practice always seem to terminate? To do this it would be sufficient to prove that it terminates with massively high probability, or at least with high enough probability that on the rare occasions when the initial configuration is part of a cycle the players just get bored and stop without actually noticing that their game wouldn’t terminate if they continued.

Even this statistical question seems to be very hard, but there is a further simplification that makes it neater without reducing its mathematical interest. That is to take a pack of cards that just contains jacks and non-jacks. The rule is that if you play a jack and your opponent follows it with a non-jack then you pick up. I tried this with my daughter: all black cards had the role of jack, and all red cards had the role of non-jacks. I was quite surprised to discover that this was a reasonable game … and it terminated.
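
Assuming the play() sketch from earlier, the same function covers this jack/non-jack game once the deck consists of 26 jacks and 26 pieces of chicken-feed. The rough Monte Carlo below (an illustration only, not a serious experiment; the trial count and move cap are arbitrary) estimates how often a random deal terminates within a generous number of moves.

```python
import random

def estimate_termination(trials=1_000, jacks=26, others=26, cap=10_000):
    """Estimate the fraction of random jack/non-jack deals that terminate
    within cap moves, using the play() sketch above ('J' is the only
    penalty card; '-' is chicken-feed)."""
    terminated = 0
    for _ in range(trials):
        deck = ['J'] * jacks + ['-'] * others
        random.shuffle(deck)
        half = len(deck) // 2
        if play(deck[:half], deck[half:], cap) is not None:
            terminated += 1
    return terminated / trials
```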

If, as I suspect, even this game is very hard to analyse statistically, it is still quite tempting to try to devise a convincing non-rigorous model for it that would “explain” the observed phenomenon that the game appears to terminate with very high probability, and perhaps even predict the distribution of how long it takes. One of the nice features of the game is that the procedure of putting down the cards alternately has a shuffling effect. This is highly relevant, because it seems that a player gets a big advantage from a longish run that is dense with royal cards, but these runs get naturally dispersed unless both players have runs at the same time. Perhaps there is a way of modelling this dispersal effect probabilistically.

Similarly, in the real game of Beggar-My-Neighbour, it makes a big difference who has the jacks, since it is quite hard to lose a jack. This is because after you play a jack, your opponent can win it only by playing a royal card and then going on to be the person who next picks up. In practice, this feels like a random event: you play a jack and then either breathe a sigh of relief if you don’t lose it or curse your luck if you do. And if, like most mortals, you’ve forgotten the order of the cards, then in a certain sense it is a random event, despite its completely deterministic nature.

Perhaps one could devise a realistic “randomized Beggar-My-Neighbour” that was easier to study. Even something rather simple might help, such as occasionally inserting random non-royal cards (which I myself know as “chicken-feed” for some reason) into hands, or removing them. Perhaps if this was done at the right rate it would make certain events into ones that could be analysed probabilistically, but it would not destroy too badly the important features of the game, such as the gradual dispersal of significant runs of cards. Of course, it might well be possible to prove that after a randomization procedure the game terminated with probability 1 because just occasionally something incredibly unlikely happened that caused it to terminate for trivial reasons. So the question would have to be to show that with high probability it terminates after not too ridiculously long a time.

And then of course one could try randomizing the simpler version of the game with just jacks and non-jacks.

To summarize, I think it is an interesting challenge (not Hilbert-interesting but certainly interesting given the current level of interest in probabilistic processes with simply described dependencies) to devise some simplified model of Beggar-My-Neighbour and prove rigorously that it terminates reasonably quickly with high probability. Indeed, I find this a more interesting question than the question of whether the actual game always terminates.

All these are very natural reactions to the original question, so it is probable that others have already had them. If you know of their having been raised, then I’d be interested to hear about it.

Wikipedia has a reference to an American Mathematical Monthly article on the subject. I can’t get hold of that, but have found a discussion elsewhere (by Richard Guy) that gives a few salient facts that I think are contained in it: there are no cycles if you halve the standard pack (this was discovered by a computer search), but there are if you add or take away two non-royal cards; and an arrangement has been found of the standard pack that takes 5790 goes before it terminates. 

12 Responses to “Open problems concerning card games”

  1. Chris Johnson Says:

    Very interesting article. The AMM paper lists the arrangement of the 52-card pack mentioned at the end of your article as being due to Michael Kleber and taking 805 moves, with 5790 cards dealt.

    Michael Kleber’s site ( http://people.brandeis.edu/~kleber/ ) lists a new record (as of July 2007) of 975 moves, by Richard Mann and Nick Wu at Oxford.

  2. John Armstrong Says:

    A quick note: that version of solitaire is known as “Klondike”.

    As for the main point, you say that there’s a cycle for a certain 50-card deck? Can’t this be bootstrapped into a cycle for the full deck? Or did you mean that there’s a known cycle for a certain 24-card deck?

  3. gowers Says:

    I meant 24, at least if I understood correctly what Richard Guy wrote.

    While I’m at it, here’s another, perhaps better, idea for a probabilistic model of the simplified version of the game. Instead of randomly inserting or removing cards, you randomly change the value of a random card (to jack if it’s a non-jack, and to non-jack if it’s a jack), according to some simple rule, such as each go making a change with probability p, with all decisions independent. The hope would be that for some fairly small p (if that last rule was adopted) the game could be shown to terminate in reasonable time with high probability.
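
    In case it helps to see that rule written out, here is one possible reading of it as code, in the same illustrative spirit as the sketches in the post (the choice to flip a uniformly random card anywhere in the configuration, and the particular parameter values, are simply choices made for the example):

```python
import random
from collections import deque

def perturbed_game_length(hand1, hand2, p=0.01, cap=1_000_000):
    """Jack/non-jack game in which, before each move, with probability p a
    uniformly random card (in either hand or in the pile) is flipped
    between 'J' and '-'.  One guess at the randomization described above.
    Returns the number of cards played, or None if the cap is reached."""
    hands = [deque(hand1), deque(hand2)]
    pile = []
    turn, to_pay, moves = 0, 0, 0
    while hands[0] and hands[1]:
        if moves >= cap:
            return None
        if random.random() < p:
            # Flip a random card somewhere in the current configuration.
            places = [(h, i) for h in (hands[0], hands[1], pile)
                      for i in range(len(h))]
            h, i = random.choice(places)
            h[i] = '-' if h[i] == 'J' else 'J'
        card = hands[turn].popleft()
        pile.append(card)
        moves += 1
        if card == 'J':
            to_pay, turn = 1, 1 - turn
        elif to_pay > 0:
            to_pay = 0
            turn = 1 - turn
            hands[turn].extend(pile)   # the jack-player collects the pile
            pile = []
        else:
            turn = 1 - turn
    return moves
```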

    And here are a few more thoughts on the problem (added a bit later). The first is that there is a drastically simplified version of the problem that might actually give decent predictions: it would be interesting to find out. It can be defined for a standard pack but for simplicity let me just define it for the black/red = jack/non-jack game. You just randomize completely. In other words, each time you turn a card over, it’s a jack with probability 1/2 and a non-jack with probability 1/2. (One could actually play this game by tossing coins and never looking at a card.) I haven’t done the exercise, but this should give a pretty simple random walk on a very simple configuration space (the number of cards in the hand of one of the two players and whose turn it is). It would be interesting to know how the length of the game is distributed in this case, and also in the case where each card that appears is a jack with probability 1/13, a queen with probability 1/13, a king with probability 1/13, an ace with probability 1/13, and chicken-feed with probability 9/13.
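
    A back-of-envelope simulation of this fully randomized version only needs to track the card counts, whose turn it is, and whether a penalty is outstanding. The sketch below does exactly that for the jack/non-jack case; the names and the move cap are arbitrary, and handling queens, kings and aces with probability 1/13 each would just mean drawing a random penalty size as well. Averaging it over many runs gives an empirical version of the length distribution asked about above.

```python
import random

def random_game_length(n=52, p_jack=0.5, cap=1_000_000):
    """Length (in cards played) of the fully randomized jack/non-jack game:
    every card turned over is a jack with probability p_jack, independently
    of everything else, so only the card counts need to be tracked.
    Returns None if the cap is reached."""
    counts = [n // 2, n - n // 2]   # cards held by players 0 and 1
    pile, turn, to_pay, moves = 0, 0, 0, 0
    while counts[0] > 0 and counts[1] > 0:
        if moves >= cap:
            return None
        counts[turn] -= 1
        pile += 1
        moves += 1
        if random.random() < p_jack:       # a jack turns up
            to_pay = 1
            turn = 1 - turn
        elif to_pay > 0:                    # penalty paid with a non-jack
            to_pay = 0
            turn = 1 - turn
            counts[turn] += pile            # the jack-player collects the pile
            pile = 0
        else:
            turn = 1 - turn
    return moves
```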

    The second thought (which is bound to be standard) is that the game is reversible, in the sense that if you see the order of the cards at the end of the game, then you can work out how the entire game went up to that point. (The one thing you can’t tell is when the game started, but you sometimes reach a point where you cannot continue backwards, so you know that it started after that point.) It’s an easy exercise to check these facts.

    The third thought, which I’m surprised I didn’t have earlier, is that if one wants to analyse the deterministic version of the game, then dynamical systems could possibly help, since it is full of concepts that do justice to our intuitions that deterministic processes can “behave randomly”. However, the fact that games terminate presents an initial technical problem that I don’t immediately see how to solve. (Perhaps one could let the number of cards tend to infinity and use the fact that the proportion of configurations that are terminal tends to zero, and thereby obtain a game that one could think of as Beggar-My-Neighbour as played in Asymptopia, or ultra-Beggar-My-Neighbour, or something.)

  4. Henry Wilton Says:

    My grandmother always referred to the non-royal cards as ‘cannon fodder’.

  5. Gil Kalai Says:

    A nice game of cards, especially since it requires absolutely no thinking, is ‘war’. You split the deck of cards equally between 2 or more players and each player in turn puts down his top card. The highest card takes all the others, and the round winner puts his winnings in a second pile to be used once the primary pile is exhausted. The exciting part of the game, called ‘war’, happens when two (or more) players put down the same card and there is a ‘tie’. The rule is that each of them then puts his next three cards on top of this card. This is repeated until there is a clear winner. Although most of my friends do not regard this game as sufficiently challenging, it is a very nice game. (There are two variants, depending on how you treat your second pile when the primary pile is exhausted. In one version you take it as is, and in the other you first shuffle it.)

    The solitaire version, where one player plays the hands of 7-10 different players and sees what happens, is quite exciting as well. There is a Wikipedia article describing the game and a reference to a site with many variations.

    Probably, simple as the game is, all the natural probabilistic questions you can ask about it are rather hopeless. Amir Dembo and I once considered (in 1990 or so) a simplified version where you have a pack of n cards with distinct values. In this version there are no “wars” and, while the game is still exciting, the outcome is clear: the player with the highest card wins. I think we were able to show (for the version where the new piles are shuffled, or perhaps even for a further simplification where each time you pick a random card) that the number of moves until victory is of the order of n^2 log n.

    Another nice variation of war is for three (or any odd number of) players. In each round the player with the middle card wins and takes the other cards. After each player has played all his cards (just once), the player who collected the middle amount of cards wins.

  6. Doug Says:

    Hi Tim,

    Lieven le Bruyn is discussing surreal numbers and chess from a combinatorial game theory perspective, but with a reference to Conway’s ONAG.

    In this setting there would seem to be common ground between chess and cards?

  7. harrison Says:

    Hmm. I’ve seen a few places (Wikipedia, Guy) that mention that Conway listed this as an “anti-Hilbert problem,” but I can’t find an actual reference to Conway’s original article/statement. Anyone know one?

  8. Do Science Research By Playing a Game « The Number Warrior Says:

    […] Magic. It is possible some other recreational mathematics might work if modified, like a version of this card game Timothy Gowers recently wrote about. […]

  9. Chelyabinsk Says:

    Somehow I missed the point. Probably lost in translation 🙂 Anyway … nice blog to visit.

    cheers, Chelyabinsk.

  10. I. J. Kennedy Says:

    I have a .pdf copy of the Feb 1999 American Mathematical Monthly article you cite in this post. If you want it, drop me a line at jack at realmode.com. I would have just sent it to you but I couldn’t find your address anywhere.

  11. GB Says:

    If anyone is still out there googling around like I was and ending up here, Michael Kleber’s web site at Brandeis is no more, but I found Richard Mann’s. They have a new record of 1007 rounds (or “moves” or “tricks”). Wow!

    http://www.robots.ox.ac.uk/~rmann/research.php
    Starting hands:
    1: K-KK—-K-A—–JAA–Q–J-
    2: —Q—Q-J—–J——AQ–

    • Anonymous Says:

      GB –

      That link is now dead too. I can’t find anything from Richard Mann on Google. Also, the dashes in your listed starting hands seem to have been reformatted when you submitted your post. Any idea what the listing is actually supposed to be?
