I’ll give a 50% chance that I’ll experience that. (One copy of me continues in the “real” world, another copy of me appears in a simulation and goes bowling.)
(If you ask this question as “the AI is going to run N copies of the bowling simulation”, then I’m not sure how to answer—I’m not sure how to weight N copies of the exact same experience. My intuition is that I should still give a 50% chance unless the simulations are going to differ in some respect, in which case I’d give an N/(N+1) chance.)
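A quick sketch of that weighting, under the assumption (my reading of the parenthetical, not anything stated outright) that identical copies of one experience collapse into a single branch, while distinct copies each count as a separate, equally weighted branch:

```python
from fractions import Fraction

def p_simulated(n_sims, distinct):
    """Chance of finding yourself in a simulation under the weighting
    above: identical copies of one experience collapse into a single
    branch; distinct copies each count as a separate branch."""
    if not distinct:
        # N identical simulations = one experience-stream, so it's
        # one real branch vs. one simulated branch: 50%.
        return Fraction(1, 2)
    # One real branch plus N distinct simulated branches, all
    # weighted equally.
    return Fraction(n_sims, n_sims + 1)

print(p_simulated(3, distinct=False))  # 1/2
print(p_simulated(3, distinct=True))   # 3/4
```

On this bookkeeping, N identical simulations always give 1/2, while N distinct ones approach certainty as N grows; whether identical copies really should collapse like that is exactly the open question the parenthetical flags.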
I need to think about your answer, as right now it doesn’t make any sense to me. I suspect that whatever intuition underlies it is the source of our disagreement/confusion.
@linkhyrule5 had an answer better than the one I had in mind. The probability of us going bowling together is approximately equal to the probability that you are already in said simulation, if computational continuity is what matters.
If there were a 6th Day-style service I could sign up for, where if anything were to happen to me a clone/simulation of me with my memories would be created, I’d sign up for it in a heartbeat. Because if something were to happen to me, I wouldn’t want to deprive my wife of her husband, or my daughters of their father. But that is purely altruistic: I would have a P(~0) expectation that I would actually experience that resurrection. Rather, some doppelganger twin that in every outward way behaves like me will take up my life where I left off. And that’s fine, but let’s be clear about the difference.
If you are not the simulation the AI was referring to, then you and it will not go bowling together, period. Because when said bowling occurs, you’ll be dead. Or maybe you’ll be alive and well and off doing other things while the simulation is going on. But under no circumstances should you expect to wake up as the simulation, as we are assuming them to be causally separate.
At least from my way of thinking. I’m not sure I understand yet where you are coming from well enough to predict what you’d expect to experience.
@linkhyrule5 had an answer better than the one I had in mind. The probability of us going bowling together is approximately equal to the probability that you are already in said simulation, if computational continuity is what matters.
You could understand my 50% answer to be expressing my uncertainty as to whether I’m in the simulation or not. It’s the same thing.
I don’t understand what “computational continuity” means. Can you explain it using a program that computes the digits of pi as an example?
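One way to make the request concrete (my framing, not an answer anyone in the thread gave): a pi-digit computation whose entire internal state can be checkpointed, halted, and resumed in a fresh process. The sketch below uses Gibbons' unbounded spigot algorithm; the checkpoint tuple is simply the algorithm's state variables, so a resumed run is bit-for-bit the continuation of the halted one.

```python
def pi_digits(state=(1, 0, 1, 1, 3, 3)):
    """Yield (digit, checkpoint) pairs for the decimal digits of pi,
    using Gibbons' unbounded spigot algorithm. The checkpoint is the
    algorithm's entire internal state: pass a saved checkpoint back in
    to resume the computation exactly where it stopped."""
    q, r, t, k, n, l = state
    while True:
        if 4 * q + r - t < n * t:
            digit = n
            # Advance the state *before* yielding, so the checkpoint
            # resumes with the next digit rather than repeating this one.
            q, r, n = 10 * q, 10 * (r - n * t), (10 * (3 * q + r)) // t - 10 * n
            yield digit, (q, r, t, k, n, l)
        else:
            q, r, t, n, k, l = (q * k, (2 * q + r) * l, t * l,
                                (q * (7 * k + 2) + r * l) // (t * l),
                                k + 1, l + 2)

# Run the computation for five digits, then halt it completely.
gen = pi_digits()
first, ckpt = [], None
for _ in range(5):
    d, ckpt = next(gen)
    first.append(d)

# Start a *fresh* generator from the saved checkpoint. Same computation,
# or a causally separate copy that merely picks up where it left off?
resumed = pi_digits(ckpt)
rest = [next(resumed)[0] for _ in range(5)]
print(first, rest)  # [3, 1, 4, 1, 5] [9, 2, 6, 5, 3]
```

The resumed generator produces exactly the digits the halted one would have produced next. Whether that hand-off preserves “continuity”—and so whether the question of continuity even applies to such a program—is the philosophical question; the code only makes the setup precise, it doesn’t settle it.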
Rather, some doppelganger twin that in every outward way behaves like me will take up my life where I left off. And that’s fine, but let’s be clear about the difference.
I think you’re making a distinction that exists only in the map, not in the territory. Can you point to something in the territory that this matters for?