I can’t send any mail: my mail server has been infected by a can of spam, so they’re firewalling off outgoing mail until they fix it. Meanwhile, I’m trying to switch to the University of Colorado’s mail server, because I don’t agree with the policy the current one has adopted. Anyway, that means I get some time to take my mind off Perl 6 for a little while and exercise my mathematical fancy.

Here’s a very simple gambling game I came up with today. Not a game you could put in casinos; it’s too simple. But it’s an interesting mathematical case study in probabilities. Here’s the game: I give you a uniformly distributed random number in [0,1). You can either “hit” or “stay”. If you stay, I give you an amount of money proportional to your number (we’ll just say it’s that number of dollars, so you can make up to one dollar with rounding). If you hit, I give you another number, and you lose the previous one.

Here’s what makes it interesting: there is a fixed number of rounds, say, ten. If you hit ten times, you have to stay with the final number, no matter what it is. It could be 0.50 (the expectation), or 0.01, who knows? How do you devise a strategy for maximizing the amount of money you get out of this game?

At each step in the game, you look at the current number and at the expected value of the game if you hit. If the current number is bigger, you stay; otherwise you hit. How do you compute the expected value? Well, it depends on your strategy (this is where this problem differs from most probability problems). So you have to compute it recursively. Here’s how it goes:

Call the expected value of the game with *m* moves left *ev(m)*. Clearly *ev(0)* is 1/2, since it could be anything, and it’s uniform.

Now, at each move in the game (with *m* moves left), you take the current number *n* and compare it against the expected value of the rest of the game *ev(m-1)*. To get *ev(m)* you do that for every possible thing that could come up, and multiply it by the probability of it happening: *ev(m) = ∫ _{0}^{1}(If n > ev(m-1) then n else ev(m-1))dn*. Integrals with conditionals inside them are usually pretty nasty, but this one can be broken up easily.
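As a quick sanity check on that split, here’s a small Python sketch (the names are mine, not anything from the post): it compares a midpoint-rule approximation of *∫ _{0}^{1}(If n > c then n else c)dn* against the split form *∫ _{0}^{c}c dn + ∫ _{c}^{1}n dn = c^{2} + (1-c^{2})/2*.

```python
# The integrand "if n > c then n else c" is just max(n, c).

def split_form(c):
    # ∫_0^c c dn + ∫_c^1 n dn = c^2 + (1 - c^2)/2
    return c * c + (1 - c * c) / 2

def midpoint_integral(c, steps=100_000):
    # Midpoint-rule approximation of ∫_0^1 max(n, c) dn
    h = 1.0 / steps
    return sum(max((i + 0.5) * h, c) for i in range(steps)) * h

c = 0.5  # try ev(0) as the threshold
print(midpoint_integral(c), split_form(c))  # the two agree to several decimals
```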

*ev(m) = ∫ _{0}^{ev(m-1)}ev(m-1)dn + ∫ _{ev(m-1)}^{1}n dn*, which, expanding it, turns into the recurrence relation:

*ev(m) = ev(m-1)^{2} + (1 - ev(m-1)^{2})/2*.

And there you go. The first few values of this look like:

| n | ev(n) |
|---|-------|
| 0 | 0.500 |
| 1 | 0.625 |
| 2 | 0.695 |
| 3 | 0.742 |
| 4 | 0.775 |
| 5 | 0.800 |
| 6 | 0.820 |
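The table can be reproduced straight from the recurrence. A throwaway Python sketch (`ev` is just my name for it):

```python
def ev(m):
    # ev(0) = 1/2; ev(m) = ev(m-1)^2 + (1 - ev(m-1)^2)/2
    v = 0.5
    for _ in range(m):
        v = v * v + (1 - v * v) / 2
    return v

for n in range(7):
    print(f"{n} | {ev(n):.3f}")
```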

This basically means that if you have *n* rounds left, you should hit if your number is less than *ev(n-1)* (the expected value of the rest of the game) and stay otherwise. We can see that this threshold tends toward one as *n* goes to infinity, which is what we were expecting.
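The strategy can also be sanity-checked by simulation. Here’s a Monte Carlo sketch under my reading of the rules (up to ten hits, and you stay as soon as the current number beats the value of playing on); the names are mine:

```python
import random

def ev(m):
    # ev(0) = 1/2; ev(m) = ev(m-1)^2 + (1 - ev(m-1)^2)/2
    v = 0.5
    for _ in range(m):
        v = v * v + (1 - v * v) / 2
    return v

def play(hits, rng):
    # With m hits still available, stay iff the current number beats
    # ev(m-1), the expected value of hitting and playing on.
    for m in range(hits, -1, -1):
        x = rng.random()
        if m == 0 or x >= ev(m - 1):
            return x

rng = random.Random(1)
trials = 100_000
avg = sum(play(10, rng) for _ in range(trials)) / trials
print(avg, ev(10))  # the two should agree to a couple of decimal places
```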

The reason I was interested in this problem came up near my death in a poker tournament. I had about two big blinds left, and I was on the button. I wondered, mathematically, what criteria my hand needed to satisfy in order for me to “go with it”. I observed that it mattered how far away I was from the blinds. That problem is closely related to this one: if I can assign a single number to the quality of a hand (that’s actually impossible, if you want to keep hands probabilistically well-ordered), then I can find out when to go with one, and when to wait for the next.