I’m dead sure you’d need more than ‘just more than a doubling’ for the payoff to make sense. Let’s assume two things:
1. Net utility naturally doubles for humans roughly every 300,000 years. (This is deliberately conservative; recent history would suggest something much faster, but the numbers are so stupidly asymmetric that using recent history would be silly. Homo sapiens have been around about that long, and net utility has doubled at least once in that time.)
2. The universe will experience heat death in roughly 10^100 years.
Before you even try to factor in volatility costs, the time value of enjoying that utility, etc., your payoff has to be something like 2^10^95, because under those two assumptions the future you’re risking holds roughly 10^100 / 300,000 ≈ 3×10^94 further doublings of utility.
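Spelled out as a back-of-the-envelope sketch, using only the two assumptions above (the 300,000-year doubling time and the 10^100-year horizon, nothing else):

```python
from math import log10

# The two assumptions from above; rough, illustrative numbers.
doubling_time_years = 3e5    # net utility doubles roughly every 300,000 years
horizon_years = 1e100        # rough time until heat death

# Doublings foregone if the universe is lost now.
doublings = horizon_years / doubling_time_years      # ~3.3e94

# The required payoff is ~2**doublings, far too large to compute directly,
# so report its size as a power of ten: log10(2**doublings) = doublings * log10(2).
log10_payoff = doublings * log10(2)                  # ~1e94

print(f"doublings foregone: ~{doublings:.1e}")
print(f"required payoff:    ~10^({log10_payoff:.1e})")
```

That’s where the 2^10^95-ish number comes from: the payoff has to stand in for every doubling the universe would otherwise have had time for.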
Edit: alright, since apparently we’re having trouble with this argument, let’s clarify it.
It’s not good enough for a bet to “make sense” in some isolated fashion. You have to evaluate the opportunity cost of what you could have done with the thing you’re betting instead. My original comment was suggesting a method for evaluating that opportunity cost.
The post makes this weird “if the utility were just big enough” move while still attempting to justify the original, incredibly stupid bet. It’s a bet. Pick a payoff scheme and the math either works or it doesn’t when compared against some opportunity cost, not against some nonsensical bet from nowhere. Saying that the universe is big and valuable, vaguely gesturing at a valuation method, and then falling back on “but just make the payout bigger” misses the point. Humans are bad at evaluating such structures, and using them to build your moral theories has issues.
For the coinflip to make sense, your opportunity cost has to approach zero. Give any reasonable argument that the universe existing has an opportunity cost approaching zero, and the bet gets interesting.
But almost any valuation method you pick for the universe arrives at absurdly high ongoing value. That isn’t a Pascal’s Mugging; that’s deciding to bet the universe.
Here’s how you get to an opportunity cost near zero:
1. X-risk greater than 50%.
2. Humans are the only sapient species. (Getting past X-risk gets evaluated per species; the universal coinflip gets evaluated for the universe. That changes the math.)
3. Your certainty in both 1 and 2 is so high that your fudge factor doesn’t run up against the fact that you know the odds of the coinflip exactly.
If any of those three is not true, you can’t get a low enough opportunity cost to justify the coinflip. That still might not be enough, but you’re at least getting into the ballpark of having a discussion about the bet being sensible.
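To make the comparison concrete, here’s a toy sketch of the opportunity-cost math. The X-risk probability, the stand-in growth factor, and the “little more” bonus are all illustrative assumptions; the real growth factor implied by the doubling argument above is astronomically larger than the 2^100 used here.

```python
# Toy expected-value comparison: decline the cosmic coinflip vs. take it.
# Every number here is an illustrative assumption, not an estimate from the post.

v_now = 1.0                 # current value of the universe, normalized
growth = 2.0 ** 100         # stand-in growth factor; the doubling argument above
                            # implies something like 2**(3e94), vastly larger
v_future = v_now * growth   # expected value of the future if we keep the universe

p_xrisk = 0.6               # assumed chance we lose that future anyway (condition 1)
bonus = 0.01                # the "little more" on top of a doubling of current value

# Opportunity cost of flipping is whatever you expect to keep by declining.
ev_decline = (1 - p_xrisk) * v_future

# The flip: 50% you lose everything, 50% you get double current value plus a bit.
ev_flip = 0.5 * 0.0 + 0.5 * (2 * v_now + bonus)

print(f"EV of declining: {ev_decline:.3e}")   # ~5.1e29 with these numbers
print(f"EV of flipping:  {ev_flip:.3e}")      # ~1.0e0
```

The flip’s upside is pegged to current value while the opportunity cost is pegged to the future’s growth, so even a 60% X-risk leaves declining ahead by dozens of orders of magnitude here. That’s why all three conditions have to hold, and why even then it might not be enough.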
If anyone wants to argue, instead of downvoting, I’d take the argument. Maybe I’m missing something. But it’s just a stupid bet without some method of evaluating opportunity cost. Pick one.
I’m assuming the Cosmic Flipper is offering, not a doubling of the universe’s current value, but a doubling of its current expected value (including whatever you think the future is worth) plus a little more. If it’s just doubling current niceness or something, then yeah, that’s not nearly enough.
I’d missed that, thank you for pointing that out.
If “expected” effectively means you’re being offered a bet that is good by definition, such that even at 50/50 odds you take the bet, I suppose that’s true. If the payoff stayed static for a second flip, it wouldn’t be a good deal, but if it dynamically adjusted so that it was once again a good bet by definition, I suppose you keep taking the bet.
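As a toy model of that repeated, dynamically re-pegged bet (the re-pegging rule and the numbers are illustrative assumptions, not anything specified in the post): conditional on still being alive, every individual flip looks better than fair, and yet the unconditional picture barely moves.

```python
# Each round, conditional on surviving so far, the offer is: 50% lose everything,
# 50% double what you currently hold plus a little, so every flip is "good" in isolation.
# Illustrative toy model; the re-pegging rule is an assumption.

pot_if_all_wins = 1.0   # holdings on the branch where every flip has come up heads
p_alive = 1.0           # probability that branch is the one you are actually on

for n in range(1, 11):
    pot_if_all_wins = 2 * pot_if_all_wins + 0.01   # win: double current holdings plus a bit
    p_alive *= 0.5                                  # lose: everything is gone
    ev = p_alive * pot_if_all_wins                  # unconditional expected value after n flips
    print(f"after flip {n}: EV ~ {ev:.4f}, P(anything left) = {p_alive:.4f}")

# The unconditional EV creeps from 1.0 toward ~1.01 while the chance of keeping
# anything at all collapses toward zero.
```

That gap between “each flip is good” and “the whole policy is good” is the part our naive intuitions tend to lose.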
If instead you’re engaging with the uncertainty that people are bad at evaluating things like “expected utility,” then at least some of the point is that our naive intuitions are probably missing some of the math and some of the costs, and the bet is likely a bad bet.
If I were trying to give credence to that second possibility, I’d say that the word “expected” is now doing a bunch of hidden heavy lifting in the payoff structure, and you don’t really know what lifting it’s doing.