In quantum copying and merging, every “branch” operation preserves the total measure of the original branch.
Maybe branching quantum operations don’t make new copies, but instead represent already-existing, identical copies “becoming” no longer identical?
In the computer program analogy: instead of having one program at time t and n slightly different versions at time t+1, start out with n copies already existing (but identical) at time t, and have each one change in the branching. If you expect a t+2, you need to start with at least n^2 copies.
(That may mean a lot more copies of everything than would otherwise be expected even under many worlds, but even if it’s enough to give this diabolical monster bed-wetting nightmares, by the Occam’s razor that works for predicting physical laws, that’s absolutely fine).
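The copy-counting in the analogy above can be sketched in a couple of lines (the function name is mine, not from the discussion): covering k future rounds of n-way branching requires n^k identical copies to already exist at the start.

```python
# Back-of-the-envelope count for the analogy above: if each step can
# split every program into n slightly different successors, then to
# cover `steps` future branching rounds you need n**steps identical
# copies already existing at time t.
def copies_needed(n, steps):
    """Copies required at time t to cover `steps` rounds of n-way branching."""
    return n ** steps

# One binary branching at t+1 and another at t+2: 2**2 = 4 copies at t.
print(copies_needed(2, 2))  # → 4
```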
Come to think of it… if this interpretation isn’t true...
or for that matter, even if this is true but it isn’t true that someone who runs redundantly on three processors gets three times as much weight as someone who runs on one processor...
then wouldn’t we be vastly likely to be experiencing the last instant of experienceable existence in the Universe, because that’s where the vast majority of distinct observers would be? Omega-point simulation hypothesis anyone? :-)
But where does all the quantum interference stuff come from then?
I’m not trying to resolve any quantum interference mysteries with the above, merely anthropic ones. I have absolutely no idea where the Born probabilities come from.
Sorry, I was unclear. I meant “if what you say is the correct explanation then, as near as I can tell, there shouldn’t be anything resembling quantum interference. In your model, where is there room for things to ‘cancel out’ if copies just keep multiplying like that?”
Or did I misunderstand what you were saying?
Ah, sorry if I wasn’t clear. The copies wouldn’t multiply. In the computer program analogy, you’d have the same number of programs at every time step. So instead of doing this...
Step 1: “Program”.
Step 2: “Program0”, “Program1”.
Step 3: “Program00”, “Program01”, “Program10”, “Program11”.
You do this...
Step 1: “Program”, “Program”, “Program”, “Program”, …
Step 2: “Program0”, “Program1”, “Program0”, “Program1”, …
Step 3: “Program00”, “Program01”, “Program10”, “Program11”, …
For the sake of simplicity, this is the same algorithm, just with part of the list ignored when working out the next step. If our universe did this, surely at any point in time it would produce exactly the same experimental results as if it didn’t.
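The two step lists above can be sketched as code. This is a minimal illustration only; the function names and the indexing scheme for deciding which copy gets which bit are my own, not from the discussion. The point it demonstrates is that the set of distinct program states is identical at every step, so the two models are experimentally indistinguishable.

```python
def multiplying_model(steps):
    """First model: each program splits into a '0' and a '1' successor,
    so the list of programs doubles at every step."""
    programs = ["Program"]
    history = [list(programs)]
    for _ in range(steps):
        programs = [p + bit for p in programs for bit in "01"]
        history.append(list(programs))
    return history

def fixed_pool_model(steps):
    """Second model: start with 2**steps identical copies; a branching
    step makes already-existing copies stop being identical, so the
    list length never changes."""
    n = 2 ** steps
    programs = ["Program"] * n
    history = [list(programs)]
    for step in range(steps):
        # Copy i follows the trajectory given by the binary digits of i
        # (an arbitrary assumption of mine, made for concreteness).
        programs = [p + str((i >> (steps - 1 - step)) & 1)
                    for i, p in enumerate(programs)]
        history.append(list(programs))
    return history

# At every step the two models contain exactly the same distinct
# program states, so nothing that only observes which states exist
# could tell them apart.
for a, b in zip(multiplying_model(3), fixed_pool_model(3)):
    assert set(a) == set(b)
```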
If we’re not experiencing the last instant of experienceable existence, I think that may imply that the second model is closer to the truth, and also that someone who runs redundantly on three processors gets three times as much weight as someone who runs on one processor, for the reasons above.
Ah, okay. Sorry I misunderstood.