I think he means “as opposed to living in a simulation (possibly in another simulation, and so on)”
This seems to be one of those questions that seem like they should have answers, but actually don’t.
If there’s at least one copy of you in “a simulation” and at least one in “base-level reality”, then you’re going to run into the same problems as Sleeping Beauty, the absent-minded driver, etc., when you deal with ‘indexical probabilities’.
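Here’s a quick Monte Carlo sketch (my own illustration, assuming the standard Sleeping Beauty setup: heads means one awakening, tails means two) of why there’s no single well-defined number here. The same experiment gives a “probability of heads” of 1/2 or 1/3 depending on whether you count per run or per awakening, and “am I in base-level reality?” inherits exactly this ambiguity once there are multiple copies of you:

```python
import random

# Sleeping Beauty: a fair coin is flipped; heads -> you are woken once,
# tails -> you are woken twice. What credence should you have in heads
# upon waking? It depends on what you count over.
trials = 100_000
heads_runs = 0          # runs of the experiment where the coin was heads
awakenings = 0          # total awakenings across all runs
heads_awakenings = 0    # awakenings that occurred in heads-runs

for _ in range(trials):
    heads = random.random() < 0.5
    n_awake = 1 if heads else 2
    awakenings += n_awake
    if heads:
        heads_runs += 1
        heads_awakenings += n_awake

print("per-run ('halfer') answer:       ", heads_runs / trials)            # ~0.50
print("per-awakening ('thirder') answer:", heads_awakenings / awakenings)  # ~0.33
```

Neither count is wrong; they answer different questions, which is why “what’s the probability I’m simulated?” doesn’t pin down an answer by itself.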
There are decision-theory answers, but the ones that work don’t mention indexical probabilities at all. This does make the situation a bit harder than, say, the Sleeping Beauty problem, since you have to figure out how to weight your utility function over multiple universes.
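As a toy sketch of what such an answer looks like (the world names, weights, and payoffs below are all made up for illustration): instead of asking “which copy am I?”, you pick the one policy that maximizes utility summed over every universe containing a copy of you, weighted by how much you care about each. Choosing those weights is exactly the hard part mentioned above.

```python
# Hypothetical toy example: no indexical probabilities anywhere --
# just a weighted sum of payoffs over all universes with a copy of you.
worlds = [
    {"name": "base reality",  "weight": 1.0, "utility": {"act_A": 10, "act_B": 5}},
    {"name": "simulation #1", "weight": 0.5, "utility": {"act_A": 0,  "act_B": 8}},
    {"name": "simulation #2", "weight": 0.5, "utility": {"act_A": 0,  "act_B": 8}},
]

def total_utility(policy: str) -> float:
    # Every copy runs the same policy, so just sum the weighted payoffs.
    return sum(w["weight"] * w["utility"][policy] for w in worlds)

best = max(["act_A", "act_B"], key=total_utility)
print(best, total_utility(best))  # act_B wins (13 > 10) under these weights
```

Change the weights and the recommended act can flip, which is the sense in which weighting your utility function over universes does the work that indexical credences can’t.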