A while back I mentioned the St. Petersburg paradox. It's a hypothetical gambling scenario where you win money based on the outcome of a coin toss. If you get your first tails on the first throw, you get $1. If you get one head before your first tails, you get $2. If you get two heads before tails, you get $4. If three heads before tails, you get $8, and so on, doubling each time.
How much should you be willing to pay to play this game? If you work out the mathematical expected value of the game, it turns out to be infinite. Play this game enough times and it doesn't matter how much it costs to play: you will come out ahead in the long run. It might be the very long run, because so much of the value of the game is the one-in-a-bazillion chance to win a bazillion dollars. But formally, any casino that offers this game at any price will eventually go broke.
There's a lot of discussion of this result and why so many people wouldn't be willing to pay much to play it despite its mathematical value. For most people (including me), it boils down to the fact that while we might jump at the chance to bet a dollar on a 2% chance at winning $100, we'd hesitate to bet ten thousand dollars on a 2% chance at winning a million dollars. The math is the same, but what happens to us after the very likely loss is not.
But real life doesn't quite fit this mathematical abstraction anyway. Any casino that offers this bet will only have finite resources. Therefore the bet they're offering is to toss the coin either until tails comes up or until the casino goes broke. And that changes the math quite a bit. I'll follow the argument on the linked Wikipedia article, since it's a nice summary and it gets me out of typesetting it myself because ScienceBlogs still doesn't have LaTeX.
Since your winnings double for each head, the maximum number of times you can flip heads before the casino goes broke is L = 1 + Floor[log(W)], where W is the casino's total assets and log is the base 2 logarithm. The floor function is just a function that leaves whole numbers unchanged and rounds everything else down to the nearest whole number. For instance, Floor[2.9] = 2.
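As a quick sanity check, that cutoff is easy to compute directly. This is a sketch in Python; the function name `max_heads` is my own, not from the article:

```python
import math

def max_heads(W):
    """Longest run of heads the casino can pay out in full:
    L = 1 + floor(log2(W)) for a bankroll of W dollars."""
    return 1 + math.floor(math.log2(W))

# A $100 bankroll covers a 6-head payout (2^6 = $64),
# but a 7th head would demand $128, bankrupting the casino.
print(max_heads(100))  # 7
```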
So we calculate the expected winnings thusly (writing out the equation in plain text, since ScienceBlogs has no LaTeX):

E = sum[k = 0 to ∞] (1/2)^(k+1) · min(2^k, W)
  = sum[k = 0 to L−1] (1/2)^(k+1) · 2^k + sum[k = L to ∞] (1/2)^(k+1) · W
  = L/2 + W/2^L
The first line is pretty easy. It's the probability of tossing k heads times the winnings you'll get if you toss that many heads. It'll be whichever is smaller: 2^k dollars or the casino's total assets because you just bankrupted them.
The second line splits the sum for each of those possibilties. The first term is the pre-bankrupting winnings and the second is the post-bankrupting winnings. The third line calculates those sums. Remember L is a function of W, the casino's total holdings.
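The closed form can be checked against the direct sum numerically. A Python sketch (the function names are mine):

```python
import math

def expected_winnings(W):
    """Closed form from the derivation: E = L/2 + W / 2^L."""
    L = 1 + math.floor(math.log2(W))
    return L / 2 + W / 2**L

def expected_winnings_direct(W, terms=200):
    """Direct sum of (1/2)^(k+1) * min(2^k, W); the tail beyond
    'terms' tosses is negligible for any realistic bankroll."""
    return sum(min(2**k, W) / 2**(k + 1) for k in range(terms))

print(expected_winnings(100))  # 4.28125
```

The two agree to floating-point precision, which is a decent check that the split of the sum was done correctly.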
We can plot this. Notice I'm using a log-linear plot, so look carefully at the x-axis. The graph shows the expected winnings on the y-axis against the casino's holdings on the x-axis:
This is pretty instructive. If your friend offers you the bet but he'll only pay up to $100, then you shouldn't be willing to pay more than about $4.28 if you want to come out ahead. But even if the maximum winnings are a million dollars, it's still only worth about $10.95. And even if you were offered maximum winnings of trillions of dollars, you wouldn't be justified in paying much more than a Jackson. The logarithmic function is very slowly growing, and so that infinite expected value only comes into play because of the infinite possible winnings. Such a situation will not obtain in Vegas, so you shouldn't go betting your house to play the game. After all, they have statisticians too.
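The numbers quoted above can be reproduced in a few lines of Python (a sketch; `fair_price` is my own name for the expected winnings E):

```python
import math

def fair_price(W):
    # Expected winnings E = L/2 + W/2^L, with L = 1 + floor(log2(W))
    L = 1 + math.floor(math.log2(W))
    return L / 2 + W / 2**L

for W in (100, 10**6, 10**12):
    print(f"bankroll ${W:,}: worth about ${fair_price(W):.2f}")
```

Even a trillion-dollar bankroll puts the fair price only around $20.91, right in "Jackson" territory.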
Hi,
Could you give me some idea of why you can go from line 1 to 2?
Thanks
@1,
Yes. If k <= L - 1 = floor(log2(W)), then 2^k <= W. Therefore for k = 0, 1, ..., L-1, we have min(2^k, W) = 2^k. However, for k=L, L+1, ... we have min(2^k, W) = W.
Hence the split sum in the second line.
Huh. Looks like the software does not like the less than sign.
In words, for k less than or equal to L-1, the min of 2 to the kth power and W is 2 raised to the k power. For k equal to L or greater, W is the minimum. Therefore we can split the sum into the two parts.
Equivalently: if you bet one dollar on "heads" and double your bet after every tail, you will always come out one dollar ahead. The idea fails because you do not have an infinite purse to gamble with. For any finite purse, the expected value is zero.
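That zero-expectation claim can be verified exactly rather than by simulation. A Python sketch under the stated assumptions (fair coin, a purse that covers exactly n_bets consecutive doubled bets; the function name is mine):

```python
def martingale_ev(n_bets):
    """Exact expected value of 'bet $1 on heads, double after each loss'
    when the purse covers n_bets consecutive bets before ruin."""
    p_ruin = 0.5 ** n_bets          # probability of losing all n_bets tosses
    total_staked = 2 ** n_bets - 1  # 1 + 2 + 4 + ... + 2^(n_bets - 1)
    # Net +$1 unless ruined; lose the entire stake if ruined
    return (1 - p_ruin) * 1 - p_ruin * total_staked

print(martingale_ev(10))  # 0.0
```

The tiny chance of catastrophic loss exactly cancels the near-certain $1 gain, for any finite purse.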
Hi there,
there is actually a clean mathematical solution to the problem, showing that the correct price for such a betting game is $1.
The paradox arises because the problem usually is not well formulated, and then the limit is taken in an unclean way. If you alter the game so that you play it only N times, you will see that with the standard formula for the game's expected payoff, the probabilities are not normalized (they sum to less than 1). Because of this you actually cannot take the limit as it is usually done. If you however rewrite the payoff by going through all the different possibilities and their payoffs for the N-toss game and then take the limit, the paradox disappears.
@Matt: If you send me your email-address I can send you a pdf, explaining this in more detail.
cheers,
Georges
Oops, there was a character that the page did not like. Here is the rest of my post:
probabilities are not normalized (less than 1). Because of this you actually cannot take the limit as it is usually done. If you however rewrite the payoff by going through all the different possibilities and their payoffs for the N-toss game and then take the limit, the paradox disappears.