If it’s rational to prefer your perceptions to conform to an external reality, then it’s rational to not want to be someone else every morning.
But is it rational to entertain theories about differences in external reality that could never make any difference to subjective or objective experience?
I value other minds existing to interact with me, even if I can’t perceive them directly. And I value waking up tomorrow in the same universe (more or less) that I’m in now.
Is this rational? Eliezer defines rationality as systematized winning; I’m pointing out what counts as winning for me.
Under DT and MWI (which are not the same thing), you wake up in all the universes you were ever in.
ETA
You might have a concern about your measure being dominated by simulations. That isn’t the same as jumping, though. Also, you can only be simulated if you ever had a real life, so it’s possible to take the glass-half-full view: the simulations are a bonus on top of a fully real life, not a dilution of it.