I’ve noticed that I’m no longer confused about anthropics, and a prediction-market-based approach works.
Postulate. Anticipating (expecting) something is relevant only to decision making (for instance, expected utility calculations).
Expecting something can be represented by betting on a prediction market (one with enough liquidity that your bet doesn’t move the price, and with no visible trade history).
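To spell out the correspondence (my gloss, not in the original): suppose a share of “E” pays $1 if E happens, and you buy one share at price p while your credence in E is q. Then

E[profit] = q · (1 − p) − (1 − q) · p = q − p,

so buying is favorable exactly when p < q: the highest price at which you would still buy reveals your probability.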
If merging copies is considered, the correct probability to bet at depends on the merging algorithm: if it sums purchased shares across all copies, then the probability is influenced by splitting; if all copies except one are ignored, then it is not.
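Here is a toy model of that claim (my own sketch, not from the post): a fair coin is flipped; on heads you are split into k copies and every copy buys one $1-if-heads share at price p, while on tails the single you buys one share; afterwards any copies are merged under one of the two rules.

```python
# Toy model: on heads you are split into k copies and each copy buys one
# $1-if-heads share at price p; on tails the single you buys one share.
def merge_rule_evs(k, p):
    p_heads = 0.5
    # Rule 1: merging sums purchased shares across all copies, so on
    # heads the merged person holds k shares, each bought at price p.
    ev_sum = p_heads * k * (1 - p) - (1 - p_heads) * p
    # Rule 2: merging keeps one copy's portfolio and ignores the rest.
    ev_keep_one = p_heads * (1 - p) - (1 - p_heads) * p
    return ev_sum, ev_keep_one

# Under rule 1 the break-even price is k/(k+1), so splitting moves it;
# under rule 2 it stays at 1/2 no matter how many copies are made.
print(merge_rule_evs(k=4, p=4/5))  # (~0.0, negative)
print(merge_rule_evs(k=4, p=1/2))  # (positive, ~0.0)
```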
If copies are not merged, then what to anticipate depends on the utility function.
“Quantum suicide”, a.k.a. rewriting arbitrary parts of your utility function with zeroes, is possible, but do you really not care about the person in the unwanted scenario? Also, if an AGI gets to know that trick, it can run arbitrarily risky experiments...
Sleeping Beauty: if both trades go through in the case where she is woken up twice, she should bet at probability 1/3. If not (for example, when actually living out the future: that opportunity will be presented to her only once), it’s coherent to bet at probability 1/2.
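A quick Monte Carlo sketch of that claim (my code, not from the post; the protocol is the standard one, with tails meaning two awakenings):

```python
import random

# Beauty buys one "$1 if heads" share at price p at every awakening.
# Heads: woken once (Monday); tails: woken twice (Monday and Tuesday).
def beauty_avg_profit(p, both_trades_count=True, n=200_000):
    total = 0.0
    for _ in range(n):
        heads = random.random() < 0.5
        trades = 1 if (heads or not both_trades_count) else 2
        payout = 1.0 if heads else 0.0
        total += trades * (payout - p)
    return total / n

print(beauty_avg_profit(1/3))                           # ~0: fair when both trades go through
print(beauty_avg_profit(1/2, both_trades_count=False))  # ~0: fair when only one trade counts
```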
I’ve heard a comment that betting odds are something different from probability:
… what makes you think it [probability] should have a use? You can feel sure something will happen, or unsure about it, whether or not that has a use.
Well, if you feel sure about an event with an incorrect probability, you may end up in a suboptimal state with respect to instrumental rationality (your expected utility calculations will be flawed), so it’s perhaps more useful to have correct intuitions. (Eliezer may want to check this out and make fun of people with incorrect intuitions, by the way :-))

New problems are welcome!
Yes, Sleeping Beauty has to account for the fact that, if the coin flip was such that she’s woken up on both Monday and Tuesday, a bet on it being Monday will surely lose one of the two times. But it will also surely win the other time, so it’s Beauty, not the counterparty, who has to put the extra dollar in the pot: betting $2 to $1 (probability 2/3) rather than $1 to $1. In expectation this is a fair bet: either she is woken once, puts $2 in the pot and collects $3 (net +$1), or she is woken twice, collects $3 on Monday (net +$1) and then loses her $2 stake on Tuesday, for −$1 overall. Each case has probability 1/2, so the expected value is zero; and 2/3 is exactly the thirder’s credence that it’s Monday.
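A short simulation (mine) confirming that arithmetic: Beauty stakes $2 against the counterparty’s $1 on “today is Monday” at every awakening.

```python
import random

# Beauty stakes $2 against $1 on "today is Monday" at every awakening.
def monday_bet_avg_profit(n=200_000):
    total = 0.0
    for _ in range(n):
        if random.random() < 0.5:
            total += 1.0        # heads: one awakening (Monday), she wins $1
        else:
            total += 1.0 - 2.0  # tails: wins $1 on Monday, loses $2 on Tuesday
    return total / n

print(monday_bet_avg_profit())  # ~0: fair at $2 to $1, i.e. P(Monday) = 2/3
```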
Anyway, feeling something is an action. I think it’s a mistake when people take “anticipation” as primary. Sure, “Make Beliefs Pay Rent (In Anticipated Experiences)” is good advice, in a similar way that a guide to getting rich is good advice. Predictive beliefs, like money, are good to pursue on general principle, even before you know what you’re going to use them for. But my anticipation of something is good for me only to the extent that the consequences of anticipating it are good for me. Like any other action.