Suppose someone offers you the chance to play the following game:
You are given an initial stake of $1. A fair coin is flipped. If the result is TAILS, you keep the current stake. If the result is HEADS, the stake doubles and the coin is flipped again, repeating the process.
How much money should you be willing to pay to play this game?
Outcomes:
1 flip  -- $1, probability 1/2
2 flips -- $2, probability 1/4
3 flips -- $4, probability 1/8
4 flips -- $8, probability 1/16
...
The expected value doesn’t converge, but it diverges extremely slowly: almost all of the value comes from an extremely tiny chance of an extremely large gain. The obvious question is counterparty risk: how much do you trust the person offering the game to actually be able to follow through on what they offered?
If we think of this as a sum over coin flips, each flip you think is possible adds another $0.50 in expected value. So if you think they’re probably only good for amounts up to $1M, then since it takes 20 flips to pass $1M, the expected value is $0.50 * 19, or $9.50. Similarly, if you think they’re good for $1B, that’s 29 flips at most, for an expected value of $14.50. You could be fancy and try to model your uncertainty about how much they’re good for, but that’s probably not worth it. And you do want to take into account that someone offering something like this, with no provision for how they’ll handle extremely large payouts, is probably not entirely on the level.
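For what it’s worth, here’s a quick Python sketch of that back-of-envelope calculation (just an illustration; the caps below are the $1,024, $1M, and $1B figures from this thread):

```python
# Back-of-envelope model from the comment above: every flip the counterparty
# could still pay out after adds another $0.50 of expected value.

def capped_expected_value(cap_dollars):
    """Approximate EV of the game if the counterparty is only good for cap_dollars."""
    ev, stake = 0.0, 1
    while stake * 2 <= cap_dollars:  # a flip is "possible" if the doubled stake is still payable
        stake *= 2
        ev += 0.50
    return ev

for cap in (1_024, 1_000_000, 1_000_000_000):
    print(f"good for ${cap:,}: EV ~ ${capped_expected_value(cap):.2f}")
```

This prints $5.00, $9.50, and $14.50, matching the figures discussed in this thread.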
Expected value is also not the right metric here, since we all have diminishing marginal returns. Would you enjoy $1B 1,000x as much as $1M? Even if you’re giving your winnings to charity, there are still some limits to our ability to effectively use additional donations.
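To put a rough number on the diminishing-returns point, here’s a sketch assuming log utility (the classic Bernoulli-style response to the paradox); the wealth levels below are made up purely for illustration:

```python
import math

def max_price(wealth, n_terms=200):
    """Largest entry price a log-utility agent with this wealth would accept."""
    def expected_utility(price):
        # Payout after n heads then a tail is 2**n dollars, with probability 2**-(n + 1).
        return sum(2.0 ** -(n + 1) * math.log(wealth - price + 2 ** n)
                   for n in range(n_terms))

    lo, hi = 0.0, float(wealth)  # expected utility falls as the price rises, so bisect
    for _ in range(60):
        mid = (lo + hi) / 2
        if expected_utility(mid) >= math.log(wealth):
            lo = mid
        else:
            hi = mid
    return lo

for wealth in (1_000, 1_000_000):
    print(f"wealth ${wealth:,}: would pay at most about ${max_price(wealth):.2f}")
```

With log utility the sum converges, so truncating it at a couple hundred terms is more than enough; the answer comes out in the tens of dollars even for a millionaire, nowhere near the "infinite" naive expected value.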
Short answer: $5. (This trusts them to be good for $1024, and is in a range where utility should still be pretty much linear in money.)
I was thinking that you should take into account the fact that if you got several trillion dollars, that would only entitle you to half of America’s resources, and if you got infinite dollars it would only give you 100% of America’s resources. It turns out that similar notions have already been studied and the expected value calculated for them on Wikipedia (well, they just assumed that the bankroll was US GDP and didn’t look at a quantity-theory-of-money solution specifically, but same diff).
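For scale, plugging a GDP-sized bankroll into the same back-of-envelope model gives a similar kind of answer (the ~$20 trillion figure here is just a round ballpark, not the number Wikipedia uses):

```python
import math

# With a bankroll around $20 trillion, roughly floor(log2(2e13)) = 44 flips are
# still payable, so the back-of-envelope expected value is about 44 * $0.50 = $22.
print(0.50 * math.floor(math.log2(20e12)))  # 22.0
```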
As formulated, zero—under the rules you posted you never win anything. Is there an unstated assumption that you can stop the game at any time and exit with your stake?
I guess I didn’t formulate the rules clearly enough—if the coin lands on tails, you exit with the stake. For example, if you play and the sequence is HEADS → HEADS → TAILS, you exit with $4. The game only ends when tails is flipped.
Also notice that as formulated (“You are given an initial stake of $1”) you don’t have any of your own money at risk, so… And if the game only ends when TAILS is flipped, there is no way to lose, is there?
If the first $1 comes from you, you are basically asking about the “double till you win” strategy. You might be interested in reading about the St. Petersburg paradox.
Reading the Wikipedia article on the St. Petersburg paradox, that’s exactly the game tetronian2 has described.
A casino offers a game of chance for a single player in which a fair coin is tossed at each stage. The pot starts at 2 dollars and is doubled every time a head appears. The first time a tail appears, the game ends and the player wins whatever is in the pot. Thus the player wins 2 dollars if a tail appears on the first toss, 4 dollars if a head appears on the first toss and a tail on the second, 8 dollars if a head appears on the first two tosses and a tail on the third, 16 dollars if a head appears on the first three tosses and a tail on the fourth, and so on. In short, the player wins 2^k dollars, where k equals the number of tosses (k must be a whole number and greater than zero). What would be a fair price to pay the casino for entering the game?
Yep. I don’t think I was ever aware of the name; someone threw this puzzle at me in a job interview a while ago, so I figured I’d post it here for fun.
The money that’s “at stake” is the amount you spend to play the game. Once the game begins, you get 2^(n) dollars, where n is the number of successive heads you flip.
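In case it helps to see those mechanics concretely, here’s a tiny simulation of the game as described (the trial counts are arbitrary):

```python
import random

# The game as described: a $1 stake doubles on every HEADS and is paid out on
# the first TAILS, so the payout is 2**n dollars after n successive heads.

def play():
    stake = 1
    while random.random() < 0.5:  # HEADS with probability 1/2
        stake *= 2
    return stake

# The sample average tends to creep upward as you add more plays, which is the
# slowly-diverging expected value showing up in practice.
for trials in (1_000, 100_000, 1_000_000):
    average = sum(play() for _ in range(trials)) / trials
    print(f"{trials:>9,} plays: average payout ${average:.2f}")
```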