This isn’t about Jesus Christ, and it isn’t about schizophrenia. It isn’t even about religion. It’s about the Simulation Argument.
If we have good reason to believe that we will be reliably simulated many times in the future, then we can trivially conclude that we are almost certainly in one of the simulations.
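A minimal sketch of the counting step behind that conclusion, assuming one original history plus some number of indistinguishable simulated copies of it (the particular setup here is illustrative, not part of the argument’s original wording):

```python
# Counting step: 1 original history plus n indistinguishable simulated copies,
# with you equally likely to be any one of the n + 1 observers.
def p_in_simulation(n_simulations: int) -> float:
    return n_simulations / (n_simulations + 1)

for n in (1, 10, 1_000, 1_000_000):
    print(f"{n:>9} simulations -> P(in a simulation) = {p_in_simulation(n):.6f}")
# As n grows, the probability approaches 1, which is the "almost certainly" above.
```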
> This isn’t about Jesus Christ, and it isn’t about schizophrenia. It isn’t even about religion. It’s about the Simulation Argument.
Well, what it’s about is obviously open to interpretation. But I do think there is a distinction between “Am I in the Matrix?” and “Am I insane?” For one thing, we KNOW (or do we?) that there are a lot of people out there who suffer from big-time delusions. There isn’t the same certainty about the existence of simulations.
For another, suppose you are presented with compelling evidence that you are in a simulation: the simulator shows up, tells you that it’s a simulation, defies the laws of physics, and gives you some “cheat codes” which seem to work reliably. In that case, a reasonable person would update the probability that he is in a simulation to be a good deal higher, unless of course he seriously doubted his sanity. So the question of sanity would seem even more fundamental than the simulation question.
A reasonable person would update both the probability that he is in a simulation and the probability that he is insane to be a good deal higher, at the expense of the hypothesis “I am sane and not in a simulation”. That fact probably wouldn’t be changed much if he doubted his sanity already.
> A reasonable person would update both the probability that he is in a simulation and the probability that he is insane to be a good deal higher, at the expense of the hypothesis “I am sane and not in a simulation”.
Yes I agree. But if he already thinks there is a good chance he is insane, then it seems to me most of the extra probability will go to that hypothesis alone.
For example, suppose you think there is a 1 in 100 chance that you are in a simulation and a 1 in 10 chance that you are insane, and then you get a visit from Mr. Simulator. Arguably you should conclude that there is a very high probability that you are insane, perhaps 90%. Anyway, the main point is that these two issues—insanity and simulation—can be conceptually separated to a large extent.
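A rough version of that update, using the hypothetical 1-in-100 and 1-in-10 priors above, and assuming purely for illustration that the visit is about equally likely under the simulation and insanity hypotheses and essentially impossible otherwise:

```python
# Hypothetical priors from the example above; the likelihood numbers for
# "Mr. Simulator shows up" under each hypothesis are assumed for illustration.
priors      = {"sane, not simulated": 0.89, "sane, simulated": 0.01, "insane": 0.10}
likelihoods = {"sane, not simulated": 1e-9, "sane, simulated": 1e-3, "insane": 1e-3}

joint = {h: priors[h] * likelihoods[h] for h in priors}
total = sum(joint.values())
posterior = {h: round(joint[h] / total, 3) for h in joint}
print(posterior)
# {'sane, not simulated': 0.0, 'sane, simulated': 0.091, 'insane': 0.909}
# With equal likelihoods for the two surviving hypotheses, the 10:1 prior
# ratio carries over: roughly 90% insane, as suggested above.
```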
Well, it depends what you mean by “most of the extra probability”—a change from 50% to 60% probability represents a smaller change in perceived amount of evidence than from 1% to 5%. I think meeting one of the dark lords of the matrix should probably weigh more as evidence for being in a simulation than for being insane, or at worst it should be 50/50 for each hypothesis.
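One way to make “amount of evidence” concrete is to measure the shift in log-odds; a quick check of the two shifts mentioned, under that assumption:

```python
import math

def log_odds_bits(p: float) -> float:
    """Log-odds of probability p, in bits."""
    return math.log2(p / (1 - p))

# Evidence needed to move 50% -> 60%, versus 1% -> 5%, measured in bits.
print(log_odds_bits(0.60) - log_odds_bits(0.50))  # about 0.58 bits
print(log_odds_bits(0.05) - log_odds_bits(0.01))  # about 2.38 bits
```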
Certainly the concepts can be conceptually separated (unless you put more meaning into that than I’m seeing), although I object to calling the one question more fundamental than the other.
> Well, it depends what you mean by “most of the extra probability”—a change from 50% to 60% probability represents a smaller change in perceived amount of evidence than from 1% to 5%. I think meeting one of the dark lords of the matrix should probably weigh more as evidence for being in a simulation than for being insane, or at worst it should be 50/50 for each hypothesis.
I disagree, although admittedly I am too lazy to do the actual calculation. Basically you can divide things up into 3 possibilities: (1) you are sane and not in a simulation; (2) you are sane and in a simulation; and (3) you are insane and not in a simulation. (Another possibility is that you are both insane AND in a simulation, but using the probabilities I assigned, this is sufficiently unlikely that I will ignore it.)
As noted above, if you are visited by Mr. Simulator, the probability of (1) goes from high to basically zero, and that amount will be distributed between (2) and (3). To determine how much goes to each, I think you need to reverse the conditional probabilities. So: assuming that you are insane, what is the probability of perceiving a visit by Mr. Simulator? And assuming that you are in a simulation, what is the probability of being visited by Mr. Simulator? Both of these are pretty low, and I don’t see any reason to believe that one is a good deal higher than the other. So my intuition is that the insanity hypothesis is roughly as favored as before vis-à-vis the simulation hypothesis, i.e., much more likely based on my assumptions.
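A minimal sketch of that reversal, keeping the hypothetical 1% and 10% priors from earlier and treating the ratio of the two conditional probabilities as a free parameter:

```python
# Posterior split between "in a simulation" and "insane", as a function of the
# likelihood ratio  r = P(visit | simulation) / P(visit | insane).
# Priors are the hypothetical 1% / 10% figures; P(visit | sane, unsimulated)
# is treated as negligible, as in the comment above.
def posterior_split(prior_sim=0.01, prior_insane=0.10, ratio=1.0):
    w_sim, w_insane = prior_sim * ratio, prior_insane
    total = w_sim + w_insane
    return w_sim / total, w_insane / total

for r in (0.5, 1.0, 2.0, 10.0, 100.0):
    p_sim, p_insane = posterior_split(ratio=r)
    print(f"ratio {r:>6}: P(simulated) = {p_sim:.2f}, P(insane) = {p_insane:.2f}")
# At a ratio of 1 the 10:1 prior ratio is untouched (about 9% vs 91%); the
# simulation hypothesis only pulls ahead once the ratio exceeds 10, i.e. the
# prior odds against it.
```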
> Certainly the concepts can be conceptually separated (unless you put more meaning into that than I’m seeing), although I object to calling the one question more fundamental than the other.
Well, that’s just a matter of semantics, but let me ask you this: who is more likely to have a shot at developing a decent mental model of the universe, someone who is delusional or someone who is in the Matrix? Both will have a difficult time of it, but the former situation is basically hopeless.
What I meant by “50/50 for each hypothesis” was that the conditional probabilities are the same for each, so I don’t think we disagree very much there, except that I intuitively feel that meeting the simulator, and having him try to convince you that you’re in a simulation, really should be evidence that favors the hypothesis that you’re in a simulation.