On the Cards
After an interesting chat yesterday with a colleague about the difficulties involved in teaching probabilities, I thought it might be fun to write something about card games. Actually, much of science is intimately concerned with statistical reasoning and if any one activity was responsible for the development of the theory of probability, which underpins statistics, it was the rise of games of chance in the 16th and 17th centuries. Card, dice and lottery games still provide great examples of how to calculate probabilities, a skill which is very important for a physicist.
For those of you who did not misspend your youth playing with cards like I did, I should remind you that a standard pack of playing cards has 52 cards. There are 4 suits: clubs (♣), diamonds (♦), hearts (♥) and spades (♠). Clubs and spades are coloured black, while diamonds and hearts are red. Each suit contains thirteen cards, including an Ace (A), the plain numbered cards (2, 3, 4, 5, 6, 7, 8, 9 and 10), and the face cards: Jack (J), Queen (Q), and King (K). In most games the most valuable card is the Ace, followed by the King, Queen and Jack, and then the numbered cards from 10 down to 2.
I’ll start with Poker, because it seems to be one of the simplest ways of losing money these days. Imagine I start with a well-shuffled pack of 52 cards. In a game of five-card draw poker, the players essentially bet on who has the best hand made from five cards drawn from the pack. In more complicated versions of poker, such as Texas hold’em, each player has two “private” cards in hand and shares, say, five cards dealt face up on the table. These community cards are usually revealed in stages, allowing a round of betting at each stage. One has to make the best hand one can using five cards from one’s private cards and those on the table. The existence of community cards makes this very interesting because it gives some additional information about other players’ holdings. For the present discussion, however, I will just stick to individual hands and their probabilities.
How many different five-card poker hands are possible?
To answer this question we need to know about permutations and combinations. Imagine constructing a poker hand from a standard deck. The deck is full when you start, which gives you 52 choices for the first card of your hand. Once that is taken you have 51 choices for the second, and so on down to 48 choices for the last card. One might think the answer is therefore 52×51×50×49×48 = 311,875,200, but that’s not right, because it doesn’t actually matter in which order your five cards are dealt to you.
Suppose you have 4 aces and the 2 of clubs in your hand; the sequences (A♣, A♥, A♦, A♠, 2♣) and (A♥, 2♣, A♠, A♦, A♣) are counted as distinct hands among the number I obtained above. There are many other possibilities like this where the cards are the same but the order is different. In fact there are 5×4×3×2×1 = 120 such permutations of any given five cards. Mathematically this is denoted 5!, or five factorial. Dividing the number above by this gives the actual number of possible five-card poker hands: 2,598,960. This number is important because it describes the size of the “possibility space”. Each of these hands is a possible poker deal, and each is assumed to be “equally likely”, unless the dealer is cheating.
This calculation is an example of a mathematical combination, as opposed to a permutation. The number of combinations of r things chosen from a set of n is usually denoted C(n,r); in the example above, r = 5 and n = 52. Note that 52×51×50×49×48 can be written 52!/47!, so the general result for the number of combinations can likewise be written C(n,r) = n!/[(n-r)! r!].
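If you’d rather let a computer do the arithmetic, here is a minimal sketch in Python (using factorial and comb from the standard math module; math.comb needs Python 3.8 or later):

```python
from math import factorial, comb

# Ordered draws: 52 x 51 x 50 x 49 x 48 = 52!/47!
ordered = factorial(52) // factorial(47)
print(ordered)                  # 311875200

# Divide out the 5! = 120 orderings of the same five cards
print(ordered // factorial(5))  # 2598960 distinct hands

# Or use the binomial coefficient C(n, r) = n!/((n-r)! r!) directly
print(comb(52, 5))              # 2598960
```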
Poker hands are characterized by the occurrence of particular events of varying degrees of probability. For example, a flush is five cards of the same suit but not in sequence (e.g. 2♠, 4♠, 7♠, 9♠, Q♠). A numerical sequence of cards regardless of suit (e.g. 7♣, 8♠, 9♥, 10♦, J♥) is called a straight. A sequence of cards of the same suit is called a straight flush. One can also have a pair of cards of the same value, or two pairs, or three of a kind, or four of a kind, or a full house which is three of one kind and two of another. One can also have nothing at all, i.e. not even a pair.
The relative value of the different hands is determined by how probable they are, and to work that out takes quite a bit of effort.
Consider the probability of getting, say, 5 spades (in other words, a spade flush). To do this we have to calculate the number of distinct hands that have this composition. There are 13 spades in the deck to start with, so there are 13×12×11×10×9 permutations of 5 spades drawn from the pack but, because of the possible internal rearrangements, we have to divide again by 5!. The result is that there are 1287 possible hands containing 5 spades. Not all of these are mere flushes, however. Some of them will include sequences too, e.g. 8♠, 9♠, 10♠, J♠, Q♠, which makes them straight flushes. There are only 10 possible straight flushes in spades (the lowest card can be A, 2, 3, 4, 5, 6, 7, 8, 9 or 10, with the Ace counting low in A-2-3-4-5), so only 1277 of the possible hands counted above are plain flushes. The same logic applies to each of the suits, so in all there are 1277×4 = 5108 flush hands and 10×4 = 40 straight flush hands.
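The same sort of check works for the flush counting; here is a quick sketch, again leaning on Python’s math.comb:

```python
from math import comb

spade_hands = comb(13, 5)             # choose any 5 of the 13 spades
print(spade_hands)                    # 1287

straight_flushes_per_suit = 10        # lowest card A, 2, ..., 10 (Ace low in A-2-3-4-5)
plain_flushes_per_suit = spade_hands - straight_flushes_per_suit
print(plain_flushes_per_suit)         # 1277

print(4 * plain_flushes_per_suit)     # 5108 flushes across the four suits
print(4 * straight_flushes_per_suit)  # 40 straight flushes
```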
I won’t go through the details of calculating the probability of the other types of hand, but I’ve included a table showing their probabilities obtained by dividing the relevant number of possibilities by the total number of hands (given at the bottom of the middle column).
| Type of Hand | Number of Possible Hands | Probability |
|---|---|---|
| Straight Flush | 40 | 0.000015 |
| Four of a Kind | 624 | 0.000240 |
| Full House | 3,744 | 0.001441 |
| Flush | 5,108 | 0.001965 |
| Straight | 10,200 | 0.003925 |
| Three of a Kind | 54,912 | 0.021129 |
| Two Pair | 123,552 | 0.047539 |
| One Pair | 1,098,240 | 0.422569 |
| Nothing | 1,302,540 | 0.501177 |
| TOTALS | 2,598,960 | 1.00000 |
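In case you want to check the table for yourself, here is a sketch that generates all of the counts from the standard combinatorial formulas (the formulas are the textbook ones; the code is just illustrative):

```python
from math import comb

total = comb(52, 5)

counts = {
    "Straight Flush":  4 * 10,                                 # 10 per suit
    "Four of a Kind":  13 * 48,                                # quad value x one odd card
    "Full House":      13 * comb(4, 3) * 12 * comb(4, 2),      # trips value/suits, pair value/suits
    "Flush":           4 * (comb(13, 5) - 10),                 # exclude straight flushes
    "Straight":        10 * 4**5 - 40,                         # exclude straight flushes
    "Three of a Kind": 13 * comb(4, 3) * comb(12, 2) * 4 * 4,  # trips plus two odd values/suits
    "Two Pair":        comb(13, 2) * comb(4, 2)**2 * 44,       # two pair values plus one odd card
    "One Pair":        13 * comb(4, 2) * comb(12, 3) * 4**3,   # pair plus three odd values/suits
}
counts["Nothing"] = total - sum(counts.values())

for name, n in counts.items():
    print(f"{name:16s} {n:>9d}  {n / total:.6f}")
print(f"{'TOTALS':16s} {total:>9d}  {sum(counts.values()) / total:.5f}")
```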
Poker involves rounds of betting in which the players, amongst other things, try to assess how likely their hand is to win compared with the others involved in the game. If your hand is weak, you can fold and allow the accumulated bets to be given to your opponents. Alternatively, you can bluff and bet strongly on a poor hand (even if you have “nothing”) to convince your opponents that your hand is strong. This tactic can be extremely successful in the right circumstances. In the words of the late great Paul Newman in the film Cool Hand Luke, “sometimes nothing can be a real cool hand”.
If you bet heavily on your hand, the opponent may well think it is strong even if it contains nothing, and fold even if his hand has a higher value. To bluff successfully requires a good sense of timing – it depends crucially on who gets to bet first – and extremely cool nerves. To spot when an opponent is bluffing requires real psychological insight. These aspects of the game are in many ways more interesting than the basic hand probabilities, and they are difficult to reduce to mathematics.
Another card game that serves as a source for interesting problems in probability is Contract Bridge. This is one of the most difficult card games to play well because it is a game of logic that also involves chance to some degree. Bridge is a game for four people, arranged in two teams of two. The four sit at a table with members of each team facing opposite each other. Traditionally the different positions are called North, South, East and West although you don’t actually need a compass to play. North and South are partners, as are East and West.
For each hand of Bridge an ordinary pack of cards is shuffled and dealt out by one of the players, the dealer. Let us suppose that the dealer in this case is South. The pack is dealt out one card at a time to each player in turn, starting with West (to dealer’s immediate left) then North and so on in a clockwise direction. Each player ends up with thirteen cards when all the cards are dealt.
Now comes the first phase of the game, the auction. Each player looks at their cards and makes a bid, which is essentially a coded message that gives information to their partner about how good their hand is. A bid is basically an undertaking to win a certain number of tricks with a certain suit as trumps (or with no trumps). The meaning of tricks and trumps will become clear later. For example, dealer might bid “one spade”, which is a suggestion that perhaps he and his partner could win one more trick than the opposition with spades as the trump suit. This means winning seven tricks, as there are always thirteen to be won in a given deal. The next to bid – in this case West – can either pass (saying “no bid”) or bid higher, like an auction. The value of the suits increases in the sequence clubs, diamonds, hearts and spades. So to outbid one spade (1S), West has to bid at least two hearts (2H), say, if hearts is the best suit for him, but if South had opened one club (1C) then one heart (1H) would have been sufficient to overcall. Next to bid is South’s partner, North. If he likes spades as trumps he can raise the original bid. If he likes them a lot he can jump to a much higher contract, such as four spades (4S).
This is the most straightforward level of Bridge bidding, but in reality there are many bids that don’t mean what they might appear to mean at first sight. Examples include conventional bids (such as Stayman or Blackwood), splinter and transfer bids and the rest of the complex lexicon of Bridge jargon. There are some bids to which partner must respond (forcing bids), and those to which a response is discretionary. And instead of overcalling a bid, one’s opponents could “double” either for penalties in the hope that the contract will fail or as a “take-out” to indicate strength in a suit other than the one just bid.
Bidding carries on in a clockwise direction until nobody dares take it higher. Three successive passes end the auction, and the contract is then established. Whichever member of the winning partnership was first to bid the suit that ends up as trumps becomes “declarer”. If we suppose our example ended in 4S, then it is South who becomes declarer, because he was the first to bid spades (with his opening 1S). If West had overcalled two hearts (2H) and this had been passed round the table, West would be declarer.
The scoring system for Bridge encourages teams to go for high contracts rather than low ones, so if one team has the best cards it doesn’t necessarily get an easy ride; it should undertake an ambitious contract rather than stroll through a simple one. In particular there are extra points for making “game” (a contract of four spades, four hearts, five clubs, five diamonds, or three no trumps). There is a huge bonus available for bidding and making a grand slam (an undertaking to win all thirteen tricks, i.e. seven of something) and a smaller but still impressive bonus for a small slam (six of something). This encourages teams to push for a valuable contract: tricks bid and made count a lot more than overtricks even without the slam bonus.
The second phase of the game now starts. The person to the left of declarer plays a card of their choice, possibly following yet another convention, such as “fourth highest of the longest suit”. The player opposite declarer puts all his cards on the table and becomes “dummy”, playing no further part in this particular hand. Dummy’s cards are then entirely under the control of the declarer. All three players can see the cards in dummy, but only declarer can see his own hand. Apart from the role of dummy, the card play is then similar to whist.
Each trick consists of four cards played in clockwise sequence from whoever leads. Each player, including dummy, must follow the suit led if he has a card of that suit in his hand. If a player doesn’t have a card of that suit he may “ruff”, i.e. play a trump card, or simply discard some card (probably of low value) from another suit. Good Bridge players keep a careful track of all discards to improve their knowledge of the cards held by their opponents. Discards can also be used by the defence (i.e. East and West in this case) to signal to each other. Declarer can see dummy’s cards but the defenders don’t get to see each other’s.
One can win a trick in one of two ways. Either one plays the highest card of the suit led, e.g. K♥ beats 10♥ or anything else in hearts below the King; Aces are high, by the way. Alternatively, if one has no cards of the suit that has been led, one can play a trump (or “ruff”). A trump always beats a card of the original suit, but more than one player may ruff, in which case the highest trump played carries the trick. For instance, East may ruff only to be over-ruffed by South if both have none of the suit led. Of course one may not have any trumps at all, making a ruff impossible; if one has neither the original suit nor a trump one has to discard something from another suit. The possibility of winning a trick by a ruff does not exist at all if the contract is of the no-trumps variety.
Whoever wins a given trick leads to start the next one. This carries on until thirteen tricks have been played. Then comes the reckoning of whether the contract has been made. If so, points are awarded to declarer’s team. If not, penalty points are awarded to the defenders which are higher if the contract has been doubled. Then it’s time for another hand, probably another drink, and very possibly an argument about how badly declarer played the hand.
I’ve gone through the game in some detail in an attempt to make it clear why this is such an interesting game for probabilistic reasoning. During the auction, partial information is given about every player’s holding. It is vital to interpret this information correctly if the contract is to be made. The auction can reveal which of the defending team holds important high cards, or whether the trump suit is distributed strangely. Because the cards are played in strict clockwise sequence this matters a lot. On the other hand, even with very firm knowledge about where the important cards lie, one still often has a difficult logical puzzle to solve if all the potential winners in one’s hand are actually to be made into tricks. It can be a very subtle game.
I only have space-time for one illustration of this kind of thing, but it’s another one that is fun to work out. As is true to a lesser extent in poker, one is not really interested in the initial probabilities of the different hands but rather how to update these probabilities using conditional information as it may be revealed through the auction and card play. In poker this updating is done largely by interpreting the bets one’s opponents are making.
Let us suppose that I am South, and I have been daring enough to bid a grand slam in spades (7S). West leads, and North lays down dummy. I look at my hand and dummy, and realise that we have 11 trumps between us, missing only the King (K) and the 2. I have all other suits covered, and enough winners to make the contract provided I can make sure I win all the trump tricks. The King, however, poses a problem. The Ace of Spades will beat the King, but if I just lead the Ace, it may be that one of East or West has both the K and the 2. In this case he would simply play the two to my Ace. The King would be an automatic winner then: as the highest remaining trump it must win a trick eventually. The contract is then doomed.
On the other hand if the spades split 1-1 between East and West then the King drops when I lead the Ace, so that strategy makes the contract. It all depends how the cards split.
But there is a different way to play this situation. Suppose, for example, that A♠ and Q♠ are on the table (in dummy’s hand) and I, as declarer, have managed to win the first trick in my hand. If I think the K♠ lies in West’s hand, I lead a spade. West has to follow suit if he can. If he has the King, and plays it, I can cover it with the Ace so it doesn’t win. If, however, West plays low I can play Q♠. This will win if I am right about the location of the King. Next time I can lead the A♠ from dummy and the King will fall. This play is called a finesse.
But is this better than the previous strategy, playing for the drop? It’s all a question of probabilities, which in turn boils down to counting the number of possible deals that allow each strategy to work.
To start with, we need the total number of possible Bridge hands. This is quite easy: it’s the number of combinations of 13 objects taken from 52, i.e. C(52,13). This is a truly enormous number: over 600 billion. You have to play a lot of games to expect to be dealt the same hand twice!
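For the record, the exact number is easy to get from the same math.comb function used above:

```python
from math import comb

print(comb(52, 13))  # 635013559600, i.e. about 635 billion possible thirteen-card hands
```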
What we now have to do is evaluate the probability of each possible arrangement of the missing King and two. Dummy’s and declarer’s hands are known to me, so there are 26 remaining cards whose location I do not know. The relevant space of possibilities is now smaller than the original one: I have 26 cards to assign between East and West. There are C(26,13) ways of assigning West’s 13 cards, but once I have done this the remaining 13 must be in East’s hand.
Suppose West has the 2 but not the K. Conditional on this assumption, I know one of his cards, but there are 12 others remaining to be assigned from the 24 unknown cards that are not trumps. There are therefore C(24,12) hands with this arrangement of the trumps. Obviously the K has to be with East in this case. The finesse would not work, as East would cover the Q with the K, but the K would drop if the A were played.
The opposite situation, with West having the K but not the 2, has the same number of possibilities associated with it. Here West must play the K when a spade is led, so it inevitably loses to the A: declarer simply abandons the finesse when West rises with the King and covers it with the Ace.
Suppose instead West doesn’t have any trumps. There are C(24,13) ways of constructing such a hand: 13 cards chosen from the 24 remaining non-trumps. Here the finesse fails, because the K is with East, but the drop fails too: East plays the 2 on the A and the K becomes a winner.
The remaining possibility is that West has both trumps: this can happen in C(24,11) ways. Here the finesse works but the drop fails. If West plays low on the South lead, declarer calls for the Q from dummy to hold the trick; on the next lead he plays the A to drop the K.
To turn these counts into probabilities we just divide by the total number of different ways I can construct the hands of East and West, which is C(26,13). The results are summarized in the table here.
| Spades in West’s hand | Number of hands | Probability | Drop | Finesse |
|---|---|---|---|---|
| None | C(24,13) | 0.24 | 0 | 0 |
| K | C(24,12) | 0.26 | 0.26 | 0.26 |
| 2 | C(24,12) | 0.26 | 0.26 | 0 |
| K2 | C(24,11) | 0.24 | 0 | 0.24 |
| Total | C(26,13) | 1.00 | 0.52 | 0.50 |
The last two columns show the contributions of each arrangement to the probability of success of either playing for the drop or the finesse. You can see that the drop is slightly more likely to work than the finesse in this case.
Note, however, that this ignores any information gleaned from the auction, which could be crucial. For example, if West had made a bid then it is more likely that he had cards of some value so this might suggest the K might be in his hand. Note also that the probability of the drop and the probability of the finesse do not add up to one. This is because there are situations where both could work or both could fail.
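If you want to reproduce the table, here is a minimal sketch; the “drop” and “finesse” probabilities just pick out the layouts in which each play succeeds:

```python
from math import comb

total = comb(26, 13)              # ways to split the 26 unseen cards between East and West

layouts = {                       # West's spade holding : number of possible layouts
    "none": comb(24, 13),
    "K":    comb(24, 12),
    "2":    comb(24, 12),
    "K2":   comb(24, 11),
}

p = {holding: n / total for holding, n in layouts.items()}

p_drop    = p["K"] + p["2"]       # works only when the suit splits 1-1
p_finesse = p["K"] + p["K2"]      # works whenever West holds the King

for holding in layouts:
    print(f"West holds {holding:>4s}: {p[holding]:.2f}")
print(f"Drop:    {p_drop:.2f}")   # 0.52
print(f"Finesse: {p_finesse:.2f}")  # 0.50
```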
This calculation does not mean that the finesse is never the right tactic. It sometimes has much higher probability than the drop, and is often strongly motivated by information the auction has revealed. Calculating the odds precisely, however, gets more complicated the more cards are missing from declarer’s holding. For those of you too lazy to compute the probabilities, the book On Gambling, by Oswald Jacoby contains tables of the odds for just about any bridge situation you can think of.
Finally on the subject of Bridge, I wanted to mention a fact that many people think is paradoxical but which isn’t really. Looking at the table shows that the odds of a 1-1 split in spades here are 0.52:0.48, or 13:12. This comes from the number of cards in East’s and West’s hands at the point when the play is attempted. There is a much quicker way of getting this answer than the brute-force method I used above. Consider the hand that contains the spade 2: there are 12 remaining slots in that hand that the spade K might fill, but there are 13 available slots for it in the other hand. The odds on a 1-1 split must therefore be 13:12. Now suppose that instead of going straight for the trumps, I play off a few winners in the side suits (risking that they might be ruffed, of course). Suppose I lead out three Aces in the three suits other than spades and they all win. Now East and West have only 20 cards between them and, by exactly the same reasoning as before, the odds of a 1-1 split have become 10:9 instead of 13:12. Playing out seemingly irrelevant suits has increased the probability of the drop working. Although I haven’t touched the spades, my assessment of the probability of the spade distribution has changed significantly.
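The vacant-places argument is easy to put into code as well; a sketch, assuming both defenders have followed suit to the same number of side-suit tricks:

```python
# Odds that two specific missing cards (here the K and 2 of spades) sit in
# different hands, given that each defender has `vacant` unknown cards.
def split_odds(vacant):
    # Fix the hand holding the 2: the K can fall in any of the other hand's
    # `vacant` slots (a 1-1 split) or this hand's remaining `vacant - 1` slots.
    return vacant, vacant - 1

print(split_odds(13))  # (13, 12): before any side suits are played, P(split) = 13/25 = 0.52
print(split_odds(10))  # (10, 9): after three rounds of side suits, P(split) = 10/19, about 0.53
```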
This sort of thing is a major reason why I always think of probabilities in a Bayesian way. As information is gradually revealed one updates the assessment of the probability of the remaining unknowns.
But probability is only a part of Bridge; the best players don’t actually leave very much to chance…
January 28, 2009 at 11:17 am
Peter,
Thanks for the excellent summary of Bridge. It’s a dangerous game – people have shot their partners (not the opposing pair!). As a mathematical physicist I preferred my intellectual hobbies to be totally unrelated (such as a good history book), but that’s a personal choice.
Tournament bridge really needs to be played by individuals in four separate rooms who can communicate only by computer. There are too many covert ways for partners to arrange to communicate ‘forbidden’ information to each other using body language – or even by the timing of a bid (ie, how many seconds you appear to think before speaking your bid). Or is that allowed?
Games and gambling were not the spur to the quantification of probability theory that they are commonly supposed to be. They are prominent in the very early statistical literature because they were the best tutorial problems, being simple enough to solve yet non-trivial, and involving a subject that educated people were familiar with. In other words, the prominence of gambling in the early literature is a selection effect. The real spur to quantification was something far more central to civilisation: Law. Court proceedings are implicitly concerned with the probability that the accused is guilty, given the evidence. All of the pioneers of quantitative probability from the 17th century were either lawyers or sons of lawyers. (The subject as mathematics is traditionally dated from a correspondence between Pascal and Fermat in the middle of that century.)
Historians of probability theory who approached their subject mathematically supposed that the subject began in the Pascal-Fermat correspondence, but the 17th century pioneers simply quantified pre-existing concepts of reasoning under uncertainty. The earlier development of those concepts, paying particular attention to law, is covered in an excellent book “The Science of Conjecture” by the Australian scholar Jim Franklin. In a remarkable addendum he shows how law is symbiotic with many other intellectual aspects of civilisation, including mathematics itself, rhetoric, logic, ethics and politics.
Anton
January 28, 2009 at 12:31 pm
Anton,
For some reason your comment was initially labelled as “spam” and caught in my filter. I don’t know why, as you have posted many other comments that got through without a problem. The other spam messages were all adverts for online poker. Maybe I attracted them by blogging about cards. Anyway, I found your note just now while I was looking through the trash, rescued it and put it in its proper place.
I bow to your better knowledge of the history of probability. I think perhaps I was searching for a justification of my own love of gambling games. I’ve read quite a lot about the use of probability in jurisprudence and it’s a tragedy how badly this is handled in the courts these days.
In competition bridge one generally doesn’t actually speak the bid but points at a card on which all possible bids are shown. This eliminates some of the possibility of cheating by using voice inflection or stressing words differently: “ONE spade” might mean something different to “one SPADE” and “No Bid” might be different to “pass”. Coughing and banging the table can also provide subtle clues. But using the card doesn’t eliminate the possibility of cheating altogether and it does happen even in international competitions. In competition bridge you are also forbidden to talk during the play.
The last time I played in a Bridge competition was in Toronto and it turned out that not only were these rules enforced strictly but there was also a ban on alcoholic drinks at the venue. Although I played OK that night, I found the lack of conversation and alcohol made the evening entirely tedious.
In social bridge you can be a bit more relaxed about such things. Indeed, the liberal consumption of wine and spirits can be thought of as a rudimentary kind of handicap system. But there is one rule that I strongly believe in. Married people (or partners in the romantic sense) are not allowed to be partners at the Bridge table. This is NOT (as usually supposed) to prevent collusion. It’s actually to prevent arguments on the way home!
Note also that Bridge requires players to count the cards. I will never play games like Blackjack where you get chucked out of the Casino for counting, i.e. playing properly.
Peter
January 28, 2009 at 1:01 pm
I was wondering where it had gone…
The difficult part in court proceedings is how to combine quantitative data (such as DNA evidence) with qualitative evidence. This is doubly difficult when the DNA evidence involves a probability close to 0 or 1 (as is usual) because the human brain is best at handling uncertainty in the mid-region of the interval. Eventually DNA testing will become good enough to be unique (apart from identical twins), so that only contamination need be taken into account. I find it fascinating that people who committed murder in the 1960s and got away with it are now being nicked on the basis of DNA samples from preserved evidence.
I can think of plenty of covert communication channels in Bridge even if speech is prohibited and you bid by pointing to your chosen bid on a card. Which finger do you point with; does your finger approach the bid from left or right, above or below; how long do you delay before making your bid, etc. I’ll bet James Randi could work out a system that communicates almost everything in your hand to your partner.
I loved the book “The Newtonian Casino” about the pioneers of covert realtime computing in a casino. They made all the mistakes that allowed others to profit from what they did. But they had fun building computers from scratch in shoes, and having keypads stuck to their stomachs, etc.
Anton
PS Is it gambling if there is an element of skill?
January 28, 2009 at 1:07 pm
BTW, all of the slick shortcuts in probability calculations correspond to clever use of the sum and product rules, to get from the probabilities you have to the probabilities you want. Doing it the longer way will still give the same answer, though – the form of the sum and product rules guarantees that. This is in fact the basis of the deepest derivation of these rules, by the unsung hero RT Cox in 1946.
January 28, 2009 at 1:14 pm
Bridge competitions have officials that try to spot attempts to cheat but there are still some that get through, I’m sure.
I should also mention that one can also play duplicate bridge, in which the teams are of four (two pairs). The same cards are dealt at two tables (between which no communication is possible). Team A will have the N-S cards in one room and the E-W cards in the other, and vice versa. The idea is then that the actual deals don’t matter: each team gets the good cards in one room or the other. The scoring depends only on the difference in performance between the two rooms. If both teams make a grand slam sitting N-S in their respective rooms then there is no score. If one makes a contract and the other doesn’t, there is a net score to the successful team.
That eliminates one of the problems of rubber bridge (with only four players), which is that it is possible for one pair to get systematically worse cards through a whole evening.
Most card games do involve skill at some level so I suppose the only true gambling games are things like roulette. But then there is skill there too, in looking at the rules and deciding whether you should play in the first place.