Ruminating further, I think I’ve narrowed down the region where the fallacious step occurs.
Suppose there are 100 simulacra, and suppose for each simulacrum you flip a coin biased 9:1 in favor of heads. You choose one of two actions for each simulacrum, depending on whether the coin shows heads or tails, but the two actions have equal net utility for the simulacra so there are no moral conundra. Now, even though the combination of 90 heads and 10 tails is the most common, the permutations comprising it are nonetheless vastly outnumbered by all the remaining permutations. Suppose that after flipping 100 biased coins, the actual result is 85 heads and 15 tails.
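To make the counting concrete, here is a quick sketch (Python; the printed numbers are only illustrative) comparing the sequences that make up the 90-heads combination against everything else, both by raw count and by probability mass under the 9:1 bias:

```python
from math import comb

n, p = 100, 0.9          # 100 simulacra, coin biased 9:1 toward heads

# Distinct heads/tails sequences with exactly 90 heads, vs. all sequences.
seqs_90 = comb(n, 90)
all_seqs = 2 ** n
print(seqs_90 / all_seqs)          # ~1.4e-17: a tiny sliver of all sequences

# Probability mass, which is what matters once the 9:1 bias is applied:
def p_exact(k):
    """Probability of exactly k heads in n flips of the biased coin."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

print(p_exact(90))                 # ~0.13: the single most probable count
print(1 - p_exact(90))             # ~0.87: all the other counts combined
print(p_exact(85))                 # ~0.03: the result actually observed
```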
What is the subjective probability? The coin flips are independent events, so the subjective probability of each coin flip must be 9:1 favoring heads. The fact that only 85 simulacra actually experienced heads is completely irrelevant.
Subjective probability arises from knowledge, so in practice none of the simulacra experience a subjective probability after a single coin flip. If the coin flip is repeated multiple times for all simulacra, then as each simulacrum experiences more coin flips while iterating through its state function, it will gradually converge on the objective probability of 90%. The first coin flip merely biases the experience of each simulacrum, determining the direction from which each will converge on the limit.
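A minimal simulation sketch of that convergence (Python; the flip count and seed are arbitrary):

```python
import random

P_HEADS = 0.9       # objective probability of heads (9:1 bias)
N_FLIPS = 10_000    # flips experienced by a single simulacrum

random.seed(0)

def running_frequency(first_flip_heads: bool) -> float:
    """Frequency of heads after N_FLIPS, with the first flip forced."""
    heads = 1 if first_flip_heads else 0
    for _ in range(N_FLIPS - 1):
        heads += random.random() < P_HEADS
    return heads / N_FLIPS

# The forced first flip only sets where the running frequency starts;
# after many flips both simulacra land on the same 0.9 limit.
print(running_frequency(True))    # ≈ 0.9
print(running_frequency(False))   # ≈ 0.9
```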
That said, take what I say with a grain of salt, because I seriously doubt this can be extended from the classical realm to cover quantum simulacra and the Born rule.
And, since I can’t let that stand without tangling myself up in Yudkowsky’s “Outlawing Anthropics” post, I’ll present my conclusion on that as well:
To recapitulate the scenario: Suppose 20 copies of me are created and put to sleep, and a fair coin is tossed. If heads, 18 copies go to green rooms and 2 to red rooms; if tails, vice versa. Upon waking, each copy in a green room is asked: “Shall we give $1 to each copy in a green room, while taking $3 from each copy in a red room?” (All must agree, or something sufficiently horrible happens.)
The correct answer is “no”. Because I have copies and I am interacting with them, it is not proper for me to infer from waking in a green room that I live in heads-world with 90% probability. Rather, it is 100% certain that at least 2 of my copies wake in green rooms, and if I am one of them, the odds are 50-50 whether I have 1 green-room companion or 17. I must not change my answer if I value the 18 copies who would be sitting in red rooms on tails.
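For concreteness, here is the expected-value arithmetic behind that “no” (a sketch; the dollar totals follow directly from the scenario above):

```python
# If all green-roomers say "yes":
#   heads: 18 greens gain $1, 2 reds lose $3   ->  +18 -  6 = +$12
#   tails:  2 greens gain $1, 18 reds lose $3  ->   +2 - 54 = -$52
heads_total = 18 * 1 - 2 * 3
tails_total = 2 * 1 - 18 * 3

# Keeping the 50-50 prior (waking in a green room was guaranteed for someone,
# so there is nothing to update on):
print(0.5 * heads_total + 0.5 * tails_total)   # -20.0 -> say "no"

# The fallacious 90% anthropic update makes the same bet look profitable:
print(0.9 * heads_total + 0.1 * tails_total)   # +5.6  -> would wrongly say "yes"
```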
However, suppose there were only one of me instead. There is still a coin flip, and there are still 20 rooms (18 green/2 red or 2 green/18 red, depending on the flip), but I am placed into one of the rooms at random. Now, I wake in a green room, and I am asked a slightly different question: “Would you bet the coin was heads? Win $1 if it was, lose $3 if it wasn’t.” My answer is now “yes”: I am no longer interacting with copies, the expected utility is +$0.60, so I take the bet.
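The Bayes step behind that +$0.60, as a short sketch:

```python
# Single occupant, placed in one of 20 rooms at random.
p_green_given_heads = 18 / 20    # heads: 18 of the 20 rooms are green
p_green_given_tails = 2 / 20     # tails: only 2 of the 20 rooms are green

# Posterior probability of heads after waking in a green room (fair-coin prior):
p_heads = 0.5 * p_green_given_heads / (0.5 * p_green_given_heads
                                       + 0.5 * p_green_given_tails)
print(p_heads)                              # 0.9

# Expected value of the bet: win $1 on heads, lose $3 on tails.
print(p_heads * 1 + (1 - p_heads) * -3)     # +0.6 -> take the bet
```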
The stuff about Boltzmann brains is a false dilemma. There’s no point in valuing the Boltzmann brain scenario over any of the other “trapped in the Matrix” / “brain in a jar” scenarios, of which there is a limitless supply. See, for instance, this lecture from Lawrence Krauss -- the relevant bits are from 0:24:00 to 0:41:00 -- which gives a much simpler explanation for why the universe began with low entropy, and doesn’t tie itself into loops by supposing Boltzmann pocket universes embedded in a high-entropy background universe.