The utility of leaving the (hypothetical) simulation depends on lots of facts about the “real” world. Since I’m ignorant of these facts, the expected utility of leaving strongly depends on my prior for what the outer world is like. They obviously have simulation technology, but they’re using it to create a severely inadequate virtual world. On the other hand, they provide an escape hatch, so I’m not being held here against my will and probably (?) entered voluntarily. Maybe this is a Roy Parsons scenario where bored people try playing life on hard mode?
If I willingly entered a simulation I knew I would probably not later opt out of, I assume it’s because the base level of reality sucks worse than this one.
Would **this** still be the case if **it** was a copy as opposed to a move?

Your question has me feeling dense as I try to parse it in responding, especially the bolded words. “This” meaning the entire conditional scenario I stated, or the fact of my assumption, or the facts assumed? “It” meaning what? Can you re-ask the question?
Let me try to rephrase. Two questions, one of which is hopefully just a restatement of your original comment:
If you had the option to move your consciousness from reality to a simulation that you knew was worse than reality, knowing that you probably later wouldn’t opt out and return to reality, would you do so?
If you had the option to copy your consciousness from reality to a simulation that you knew was worse than reality, knowing that you probably later wouldn’t opt out and return to reality, would you do so?