At this point I will admit that my gambling days were focused on poker, and Kelly isn’t very useful for that.
But here’s the formula as I understand it: you bet EV/odds, where EV is per dollar wagered and odds are expressed as a multiple of one. So for the coinflip case we’re disagreeing about, EV is .02, odds are 1, so you bet .02 of your bankroll.
If instead you had a coinflip with a fair coin where you were paid $2 on a win and lost $1 on a loss, your EV is .5 per dollar per flip, odds are 2, so bet 25%.
Right, so this is basically the “net expected value divided by net win value” formula which I examine in the post, because “odds” is the same as b, net winnings per dollar invested. (Double-or-nothing means odds of 1, triple-or-nothing means odds of 2, etc.)
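To make that concrete, here’s a minimal sketch in Python of the edge-over-odds calculation. The win probability of .51 for the double-or-nothing flip is my inference from the numbers above (an EV of .02 per dollar at odds of 1 implies p = .51); the function name is just for illustration.

```python
def kelly_fraction(p: float, b: float) -> float:
    """Kelly fraction = edge / odds, where edge = b*p - (1 - p) is the
    expected net value per dollar bet and b is the net payout per
    dollar won (odds expressed as a multiple of one)."""
    edge = b * p - (1 - p)
    return edge / b

# Double-or-nothing flip from the disagreement: odds b = 1, and an EV
# of .02 per dollar implies a win probability of .51.
print(kelly_fraction(p=0.51, b=1))  # ~0.02, i.e. bet 2% of your bankroll

# Fair coin paying $2 on a win and losing $1 on a loss: p = .5, b = 2.
print(kelly_fraction(p=0.5, b=2))   # 0.25, i.e. bet 25% of your bankroll
```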
At this point I will admit that my gambling days were focused on poker, and Kelly isn’t very useful for that.
Yeah? How do you decide how much you’re willing to bet on a particular hand? Kelly isn’t relevant at all?
It’s too cumbersome and only addresses part of the issue. Kelly more or less assumes that you make a bet, it gets resolved, now you can make the next bet. But in poker, with multiple streets, you have to think about a sequence of bets based on some distribution of opponent actions and new information.
Also, with Kelly you don’t usually have to think about how the size of your bet influences your likelihood of winning, but in poker the amount that you bluff changes both the probability of the bluff succeeding (people call less when you bet more) and the amount you lose if you’re wrong. Or if you value bet (meaning you want to get called), then betting more means they call less often but you win more when they do. Again, vanilla Kelly doesn’t really work.
I imagine it could be extended, but instead people have built more specialized frameworks for thinking about it that combine game theory with various stats/probability tools like Kelly.
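Not poker, but as a toy sketch of what the “it could be extended” part might look like: let the win probability depend on the fraction you bet (the linear falloff below is purely made up) and numerically maximize expected log wealth instead of using the closed-form Kelly fraction.

```python
import numpy as np

def expected_log_growth(f, b, p_win):
    """Expected log growth of bankroll when betting fraction f at net
    odds b, where the probability of winning depends on f."""
    p = p_win(f)
    return p * np.log(1 + b * f) + (1 - p) * np.log(1 - f)

def p_win(f):
    # Made-up shape: the bigger the bet, the less often it succeeds.
    return 0.6 - 0.4 * f

fractions = np.linspace(0.0, 0.5, 501)
growth = [expected_log_growth(f, 1, p_win) for f in fractions]
print(fractions[int(np.argmax(growth))])  # ~0.08, well under the 0.2
                                          # naive Kelly gives at a fixed p=0.6
```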
The Math of Poker, written by a couple of friends of mine, might be a fun read if you’re interested. It probably won’t help you to become a better poker player, but the math is good fun.
Thanks!