How do I observe this anthropic measure—how can I make any guesses about what the outside observer would see?
The same way you’d make such guesses normally—observe the world, build an implicit model, make interpretations, etc. “How” is not really an additional problem, so perhaps you’d like examples and motivation.
Suppose that I flip a quantum coin, and if it lands heads I give you cake and tails I don’t—you expect to get cake with 50% probability. Similarly, if you start with 1 unit of anthropic measure, it gets split between cake and no-cake 0.5 to 0.5. Everything is ordinary.
However, consider the case where you get no cake, but I run a perfect simulation of you in which you get cake in the near future. At some point after the simulation has started, your proper probability assignment is 50% that you’ll get cake and 50% that you won’t, just like in the quantum coin flip. But now, if you start with 1 unit of anthropic measure, your measure never changes—instead a simulation is started in the same universe that also gets 1 unit of measure!
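To make the bookkeeping in these two cases concrete, here is a toy sketch (my own illustration; the outcome labels and function names are invented for exposition, not part of any standard formalism):

```python
# Toy bookkeeping: track unnormalized anthropic measure per observer-moment,
# and only normalize when you want probabilities.

def quantum_coin():
    # One unit of measure splits across the two quantum branches.
    return {"cake": 0.5, "no-cake": 0.5}

def simulation():
    # The original observer keeps their full unit; a simulated copy that
    # gets cake is *added* in the same universe with its own full unit.
    return {"cake (simulation)": 1.0, "no-cake (original)": 1.0}

def probabilities(measure):
    # Normalizing recovers your ordinary credences.
    total = sum(measure.values())
    return {k: v / total for k, v in measure.items()}

for name, m in [("quantum coin", quantum_coin()),
                ("simulation", simulation())]:
    print(name, "measure:", m, "probabilities:", probabilities(m))
# Both cases normalize to the same 50/50, but the totals differ: 1.0 vs 2.0.
```

Both cases give the same 50% credence in cake; the difference shows up only in the unnormalized totals, which is what the preference discussed next would act on.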
If all we cared about in decision-making was probabilities, we’d treat these two cases the same (e.g. you’d pay the same amount to make either happen). But if we also care about anthropic measure, then we will probably prefer one over the other.
It’s also important to keep track of anthropic measure as an intermediate step to getting probabilities in nontrivial cases like the Sleeping Beauty problem. If you only track probabilities, you end up normalizing too soon and too often.
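Sticking with that bookkeeping, here is how the Sleeping Beauty case might look. One contestable assumption is built in, namely that each awakening carries the full measure of its branch; normalizing late then yields the “thirder” answer:

```python
# Sleeping Beauty: heads -> woken once (Monday); tails -> woken twice
# (Monday and Tuesday, with memory erased in between).
# Assign each awakening the measure of the coin branch it sits in,
# and normalize only at the moment you actually form a credence.

measure = {
    ("heads", "Monday"): 0.5,   # heads branch: one awakening
    ("tails", "Monday"): 0.5,   # tails branch: two awakenings, each
    ("tails", "Tuesday"): 0.5,  # carrying the branch's full 0.5 measure
}

total = sum(measure.values())   # 1.5, not 1.0 -- nothing normalized yet
p_heads = measure[("heads", "Monday")] / total
print(p_heads)  # 0.333...; normalizing per-branch too early would give 0.5
```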
Sorry, but the above doesn’t clarify anything for me. I might accept that the concept of probability is out of scope here, and that Bayesianism doesn’t work for guessing whether one is or isn’t in a certain simulation, but I don’t know if that’s what you meant.
I mean something a bit more complicated—that probability is working fine and giving sensible answers, but that when probability measure and anthropic measure diverge, probabilities no longer fit into decision-making in a simple way, even though they still genuinely reflect your state of knowledge.
There are many kinks in working out what a better system would actually be; hopefully I’ll eventually iron them out and write up a post.