Couldn’t they just run the simulation to its end rather than just let it sit there and take the chance that it could accidentally be destroyed? If it’s infinitely powerful, it would be able to do that.
Then they miss their chance to control reality. They could make a shield out of black cubes.
They could program in an indestructible control console, with appropriate safeguards, then run the program to its conclusion. Much safer.
That’s probably weeks of work, though, and they’ve only had one day so far. Hum, I do hope they have a good UPS.
Why would they make a shield out of black cubes of all things? But yeah, I do see your point. Then again, once you have an infinitely powerful computer, you can do anything. Plus, even if they ran the simulation to its end, they could always restart the simulation and advance it to the present time again, hence regaining the ability to control reality.
Then it would be someone else’s reality, not theirs. They can’t be inside two simulations at once.
But what if two groups had built such computers independently? The story is making less and less sense to me.
Level 558 runs the simulation and makes a cube in Level 559. Meanwhile, Level 557 makes the same cube in 558. Level 558 runs Level 559 to its conclusion. Level 557 will seem frozen in relation to 558 because they are busy running 558 to its conclusion. Level 557 will stay frozen until 558 dies.
558 makes a fresh simulation of 559. 559 creates 560 and makes a cube. But 558 is not at the same point in time as 559, so 558 won’t mirror the new 559’s actions. For example, they might be too lazy to make another cube. New 559 diverges from old 559. Old 559 ran 560 to its conclusion, just like 558 ran them to their conclusion, but new 559 might decide to do something different to new 560. 560 also diverges. Keep in mind that every level can see and control every lower level, not just the next one. Also, 557 and everything above is still frozen.
So that’s why restarting the simulation shouldn’t work.
Then instead of a stack, you have a binary tree.
Your level runs two simulations, A and B. A-World contains its own copies of A and B, as does B-World. You create a cube in A-World and a cube appears in your world. Now you know you are an A-World. You can use similar techniques to discover that you are an A-World inside a B-World inside another B-World… The worlds start to diverge as soon as they build up their identities. Unless you can convince all of them to stop differentiating themselves and cooperate, everybody will probably end up killing each other.
You can avoid this by always doing the same thing to A and B. Then everything behaves like an ordinary stack.
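The claim that symmetric treatment collapses the tree into a stack can be checked with a toy model. Everything below (the integer "world state" and the intervention functions) is invented for illustration, not taken from the story: each world applies an intervention to its A and B children, and we count how many distinct world-states exist at each depth.

```python
# Toy model: each world runs two child simulations, A and B, starting
# from the same state. An intervention maps (state, child-label) to the
# child's new state. If every world intervenes identically on A and B,
# all worlds at a given depth stay identical -- the tree acts like a stack.

def run_tree(depth, state, intervene):
    """Return, for each depth, the set of distinct world-states."""
    levels = [{state}]
    for _ in range(depth):
        next_level = set()
        for s in levels[-1]:
            next_level.add(intervene(s, "A"))
            next_level.add(intervene(s, "B"))
        levels.append(next_level)
    return levels

# Symmetric intervention: ignores which child it is applied to.
symmetric = run_tree(5, 0, lambda s, child: s + 1)
# Asymmetric intervention: treats A and B differently.
asymmetric = run_tree(5, 0, lambda s, child: s + (1 if child == "A" else 2))

print([len(level) for level in symmetric])   # [1, 1, 1, 1, 1, 1]
print([len(level) for level in asymmetric])  # [1, 2, 3, 4, 5, 6]
```

With the symmetric rule there is exactly one state per depth, which is just a stack; the moment A and B are treated differently, the number of distinct worlds grows with depth.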
Yeah, but would a binary tree of simulated worlds “converge” as we go deeper and deeper? In fact it’s not even obvious to me that a stack of worlds would “converge”: it could hit an attractor with period N where N>1, or do something even more funky. And now, a binary tree? Who knows what it’ll do?
I’m convinced it would never converge, and even if it did I would expect it to converge on something more interesting and elegant, like a cellular automaton. I have no idea what a binary tree system would do unless none of the worlds break the symmetry between A and B. In that case it would behave like a stack, and the story assumes stacks can converge.
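The "attractor with period N" possibility is easy to make concrete. Treat the stack as iterating one deterministic update function (layer i+1’s state is update applied to layer i’s state) and look for the first repeated state; the gap between repeats is the attractor’s period. A minimal sketch, with an invented update rule over a tiny state space standing in for whatever the real transition would be:

```python
# Toy model: each layer's state is a deterministic function of the
# layer above it. "Convergence" is a period-1 attractor (a fixed point);
# a period-N attractor means the layers cycle with period N forever.

def find_attractor(update, initial_state, max_layers=10_000):
    """Iterate `update` down the stack until a state repeats.

    Returns (period, layer_where_cycle_starts), or None if no repeat
    occurs within max_layers.
    """
    seen = {}  # state -> layer index where it first appeared
    state = initial_state
    for layer in range(max_layers):
        if state in seen:
            return layer - seen[state], seen[state]
        seen[state] = layer
        state = update(state)
    return None

# An arbitrary update rule, purely illustrative: 3 -> 10 -> 8 -> 3 -> ...
print(find_attractor(lambda s: (s * s + 1) % 31, 3))  # (3, 0): period-3 cycle
# A rule that does converge: halving bottoms out at the fixed point 0.
print(find_attractor(lambda s: s // 2, 10))           # (1, 4): fixed point
```

Over a finite state space a repeat is guaranteed eventually, but nothing forces the period to be 1, which is the worry about assuming the stack "converges".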
They could just turn it off. If they turned off the simulation, the only layer to exist would be the topmost layer. Since everyone has identical copies in each layer, they wouldn’t notice any change if they turned it off.
We can’t be sure that there is a top layer. Maybe there are infinitely many simulations in both directions.
But they would cease to exist. If they ran it to its end, then it’s over; they could just turn it off then. I mean, if you want to cease to exist, fine, but otherwise there’s no reason. Plus, the topmost layer is likely very, very different from the layers underneath it. In the story, it says that the differences eventually stabilized and created them, but who knows what it was originally. In other words, there’s no guarantee that you even exist outside the simulation, so by turning it off you could be destroying the only version of yourself that exists.
That doesn’t work. The layers are a little bit different. From the description in the story, they just gradually move to a stable configuration. So each layer will be a bit different. Moreover, even if every one of them but the top layer were identical, the top layer has now had slightly different experiences from the other layers, so turning it off will mean that different entities will actually no longer be around.
I’m not sure about that. The universe is described as deterministic in the story, as you noted, and every layer starts from the Big Bang and proceeds deterministically from there. So they should all be identical. As I understood it, that business about gradually reaching a stable configuration was just a hypothesis one of the characters had.
Even if there are minor differences, note that almost everything is the same in all the universes. The quantum computer exists in all of them, for instance, as does the lab and research program that created them. The simulation only started a few days before the events in the story, so just a few days ago, there was only one layer. So any changes in the characters from turning off the simulation will be very minor. At worst, it would be like waking up and losing your memory of the last few days.
Why do you think deterministic worlds can only spawn simulations of themselves?
A deterministic world could certainly simulate a different deterministic world, but only by changing the initial conditions (Big Bang) or transition rules (laws of physics). In the story, they kept things exactly the same.
That doesn’t say anything about the top layer.
I don’t understand what you mean. Until they turn the simulation on, their world is the only layer. Once they turn it on, they make lots of copies of their layer.
Until they turned it on, they thought it was the only layer.
Ok, I think I see what you mean now. My understanding of the story is as follows:
The story is about one particular stack of worlds which has the property that each world contains an infinitely powerful computer simulating the next world in the stack. All the worlds in the stack are deterministic and all the simulations have the same starting conditions and rules of physics. Therefore, all the worlds in the stack are identical (until someone interferes) and all beings in any of the stacks have exact counterparts in all the other stacks.
Now, there may be other worlds “on top” of the stack that are different, and the worlds may contain other simulations as well, but the story is just about this infinite tower. Call the top world of this infinite tower World 0. Let World i+1 be the world that is simulated by World i in this tower.
Suppose that in each world, the simulation is turned on at Jan 1, 2020 in that world’s calendar. I think your point is that in 2019 in world 1 (which is simulated at around Jan 2, 2020 in world 0) no one in world 1 realizes they’re in a simulation.
While this is true, it doesn’t matter. It doesn’t matter because the people in world 1 in 2019 (their time) are exactly identical to the people in world 0 in 2019 (world 0 time). Until the window is created (say Jan 3, 2020), they’re all the same person. After the window is created, everyone is split into two: the one in world 0, and all the others, who remain exactly identical until further interference occurs. Interference that distinguishes the worlds needs to propagate from World 0, since it’s the only world that’s different at the beginning.
For instance, suppose that the programmers in World 0 send a note to World 1 reading: “Hi, we’re world 0, you’re world 1.” World 1 will be able to verify this since none of the other worlds will receive this note. World 1 is now different from the others as well and may continue propagating changes in this way.
Now suppose that on Jan 3, 2020, the programmers in worlds 1 and up get scared when they see the proof that they’re in a simulation, and turn off the machine. This will happen at the same time in every world numbered 1 and higher. I claim that from their point of view, what occurs is exactly the same as if they forgot the last day and find themselves in world 0. Their world 0 counterparts are identical to them except for that last day. From their point of view, they “travel” to world 0. No one dies.
ETA: I just realized that world 1 will stay around if this happens. Now everyone has two copies, one in a simulation and one in the “real” world. Note that not everyone in world 1 will necessarily know they’re in a simulation, but they will probably start to diverge from their world 0 counterparts slightly because the worlds are slightly different.
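That propagation argument can be sketched as a toy model: start the tower with identical world-states, let a world append a note to the history of the world it simulates, and then check which worlds have diverged from world 0. The histories and notes below are invented for illustration:

```python
# Toy model of the tower: world i simulates world i+1. All worlds start
# from identical deterministic histories, so they stay equal until some
# world interferes with the one directly below it.

worlds = [{"history": ["big bang", "build computer"]} for _ in range(6)]

def send_note(worlds, sender, text):
    """World `sender` injects a note into the world it simulates."""
    worlds[sender + 1]["history"].append(text)

# Before any interference, every world is identical:
assert all(w == worlds[0] for w in worlds)

# World 0 tells World 1 where it sits in the tower...
send_note(worlds, 0, "you are world 1")
# ...and World 1, now knowing its position, can keep propagating:
send_note(worlds, 1, "you are world 2")

diverged = [i for i, w in enumerate(worlds) if w != worlds[0]]
print(diverged)  # [1, 2] -- divergence spreads downward, one hop per note
```

In this sketch only the worlds that have actually been touched differ from world 0; everything deeper remains identical until the interference reaches it, which matches the claim that all distinguishing changes must propagate down from World 0.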
I interpreted the story Blueberry’s way; the inverse of the way many histories converge into a single future in Permutation City, one history diverges into many futures.
I’m really confused now. Also I haven’t read Permutation City...
Just because one deterministic world will always end up simulating another does not mean there is only one possible world that would end up simulating that world.
I can’t see any point in turning it off. Run it to the end and you will live; turn it off and “current you” will cease to exist. What could justify turning it off?
EDIT: I got it. The only choice that will be effective is the top-level one. It seems that it will be a constant source of divergence.
If current you is identical with top-layer you, you won’t cease to exist by turning it off, you’ll just “become” top-layer you.
It’s surprising that they aren’t also experimenting with alternate universes, but that would be a different (and probably much longer) story.
That’s a good point. Everyone but the top layer will be identical and the top layer will then only diverge by a few seconds.