How would you compute that in one sweep-through, without any higher-order metatime?
A way has occurred to me.
Take the basic program described in the beginning of this post, in which the universe is deterministically computed with a cached series of states of the universe. The change is to make this computation parallel on a staggering scale, because of how Time-Turners work. I’m going to explain this as if there’s only one wizard with a Time-Turner that works for one hour, but I’m pretty sure it holds up for the complex case of many Turners that can go back varying amounts of time.
A wizard (Marty McFloat, let’s say) carrying a Time-Turner constantly generates a huge number of new copies of the universe that differ slightly from the ‘real’ universe in a way that has gone unobserved because of the anthropic principle. In the ‘main track’ of the universe, nothing interesting happens: a new state is simply computed from the previous state. Every other copy of the universe is the same except that a rough copy of the wizard has appeared.
Maybe this copy of Marty has a new life vest, or has a bruise, or just has his head turned slightly to the left and one leg tensed. There are a finite but huge number of these variations, including copies where only a single quark (or whatever the smallest computed unit of matter is) is out of place. But in every variation, this copy of Marty’s brain has an extra hour of memories in its head. (More variations!)
Let’s follow one of these, the Marty with a new vest. This is a potential future Marty. You can think of this as the appearance of a Marty who went clothes shopping and used his Time-Turner an hour from now, but it’s not: it’s a variation of the Marty wearing the Time-Turner, not a copy of part of a future computed state. Every possible Marty-variant is generated in a different parallel universe state.
The state of the universe is computed onwards. If, one hour later, Marty does not activate his Time-Turner, the universe fails its consistency check and is deleted. If Marty does activate it, the universe looks back at the Marty that was added an hour ago. If the two Martys are not bit-for-bit identical (at whatever the lowest scale of the computation is), the universe fails its consistency check and is deleted. If the two are identical, the ‘younger’ one that activated the Time-Turner is deleted and the universe rolls on in a consistent, causal way.
This system has no metatime. Universe computation never has to go back and restart from a previous state, modified or not. It just requires generating a near-infinite number of parallel branches for every state of the universe and destroying nearly all of them. (Which I guess is quite a lot of world-killing, to understate it.)
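To make the generate-and-check loop concrete, here’s a toy sketch. Everything in it is invented for illustration: a ‘universe’ is reduced to a single integer Marty-state, the variant set is just the integers 0–99, and the ‘physics’ is an arbitrary arithmetic rule standing in for deterministic computation.

```python
N = 100    # how many distinct Marty-states the toy allows
HOUR = 5   # ticks between the variant appearing and the Turner check

def step(marty, variant):
    # stand-in deterministic physics: the injected variant's presence
    # influences how the 'younger' Marty evolves each tick
    return (marty + variant) % N

def survives(initial_marty, variant):
    """One parallel universe: inject a variant, run the hour, check."""
    marty = initial_marty
    for _ in range(HOUR):
        marty = step(marty, variant)
    # consistency check: the Marty who activates the Time-Turner now must
    # be bit-for-bit identical to the variant that appeared an hour ago;
    # any mismatch means this universe is deleted
    return marty == variant

def surviving_universes(initial_marty):
    # spawn one branch per possible variant, keep only consistent ones
    return [v for v in range(N) if survives(initial_marty, v)]
```

With these made-up numbers, `surviving_universes(4)` leaves only 4 of the 100 branches alive; every other generated universe fails the bit-for-bit check and is deleted, which is the point: nearly all of the worlds die.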
The universe is causal and can be modeled with a directed acyclic graph. Just imagine that the history of each universe is a DAG whose states may include a new node with no parents (the ‘arriving’ variant wizard), and that it’s not one DAG but an incredibly thick stack of DAGs, most of which are discarded. The universe never needs to compute P(8pm | 9pm).
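One surviving branch’s history really is a DAG. As an illustrative sketch (the node names and parent map below are invented), here’s a single branch written as a parent map, with the arriving variant as the only parentless injected node, plus a check that the structure stays acyclic, i.e., causal:

```python
def acyclic(parents):
    """Return True if the parent map contains no cycles (i.e., is a DAG)."""
    done, in_progress = set(), set()
    def visit(node):
        if node in in_progress:
            return False          # found a cycle: history isn't causal
        if node in done:
            return True
        in_progress.add(node)
        ok = all(visit(p) for p in parents.get(node, []))
        in_progress.discard(node)
        done.add(node)
        return ok
    return all(visit(n) for n in parents)

# one hypothetical branch: the variant wizard appears at 7pm with no
# parents; every later state is computed from the previous one
branch = {
    "state@7pm": [],
    "variant_marty@7pm": [],               # injected node, no parents
    "state@8pm": ["state@7pm", "variant_marty@7pm"],
    "state@9pm": ["state@8pm"],
}
```

Nothing in the branch ever points backwards in time, so `acyclic(branch)` holds and no state is ever computed from a later one.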
If I correctly understood the prompt (does having multiple copies of the timeline count as “higher-order metatime”, or does that just mean “no rolling the clock back” as in the example?), I think this perverse solution satisfies the constraints of the question I quoted, but I’d love to hear corrections.
As a variation, nothing really requires that the universes be computed in parallel: as long as the computer has lots of disk space, it could write the generated parallel universes out to disk, continue simulating the current universe until it fails the consistency check or ends, then load any remaining state from disk and resume computation from wherever that state left off. This is an in-fiction way of restating that you can trade off space for parallelism in computation, but I’m not entirely certain what “higher-order” precludes, so I wanted to toss it out there as a variation.
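A sketch of this one-universe-at-a-time variation, using the same kind of invented toy state (an integer Marty plus an injected variant, with a made-up arithmetic rule as physics): generated branches are parked on ‘disk’ (here, a plain Python list used as a stack), and the simulator falls back to a saved branch whenever the current universe fails its check.

```python
N, HOUR = 100, 5   # toy state space and ticks until the Turner check

def step(marty, variant):
    return (marty + variant) % N   # stand-in deterministic physics

def run_serially(initial_marty):
    # write every generated branch to 'disk' up front
    disk = [(initial_marty, v, HOUR) for v in range(N)]
    while disk:
        marty, variant, ticks = disk.pop()   # resume one saved universe
        while ticks:
            marty, ticks = step(marty, variant), ticks - 1
        if marty == variant:       # consistency check passes: carry on
            return (marty, variant)
        # otherwise this universe is deleted; the loop loads the next one
    return None                    # no consistent branch existed
```

Same survivors as the parallel version would find, just discovered one at a time: the space that held simultaneous branches becomes space that holds suspended ones.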
Actually, this whole post is an example of the general principle that you can trade off space for time in programs. It just ends up looking really weird when you push this optimization technique to a ridiculous scale. As for who would simulate a universe this way, I would guess some poor sap who was overruled in the design meetings and told to “just make it work”.
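At ordinary scale, the same trade is just caching: spend memory so you never recompute a state. A minimal example, with an invented state-transition rule, of the “cached series of states” idea from the top of the post:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def state_at(t):
    # toy deterministic 'universe': each state is a pure function of the
    # previous one, so every cached state is computed at most once
    return 0 if t == 0 else (state_at(t - 1) * 31 + 7) % 1000
```

After `state_at(500)` runs once, asking for any earlier state is a cache hit rather than a re-simulation; the Time-Turner scheme above is this trick pushed to a ridiculous scale.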
(On a meta note, I’ve been wondering for a few years whether anything would prompt me to create a LessWrong account and participate. I read this post from my feed reader this morning, then went on to have a very busy and meaningful day in my personal life. I didn’t think about the post at any point, went to bed, and woke up three hours later with this answer fully formed, except for the one-simulation-at-a-time variation, which I thought of while typing this up so I could get it out of my head and go back to bed. I guess waking me up with a ridiculous answer to a puzzle I didn’t know I was chewing on will prompt it.)
If, one hour later, Marty does not activate his Time-Turner, the universe fails its consistency check and is deleted. If Marty does activate it, the universe looks back at the Marty that was added an hour ago. If the two Martys are not bit-for-bit (at whatever the lowest scale of the computation is), the universe fails its consistency check and is deleted.
fails its consistency check and is deleted
By whom? The DM? Jokes aside, how does that happen, exactly?
(As a matter of fact, that could be an amusing mechanic to add to games that allow for time travel, though the players would be stuck in a Groundhog Day Loop until they pass the check).
There is a game which does exactly this. It’s called Chronotron, and you play a time-traveling robot who has to create multiple versions of himself via time travel in order to complete each stage. The loop must be closed for all iterations, though, which means that if a future version of himself interacts with a past version of himself in such a way that the past version is prevented from going back in time, you lose the stage and have to start over.
Pretty fun game, although since it’s a Flash game it’s relatively short.
I was continuing from the post’s opening thought experiment of a computed universe: I was thinking that whatever program is computing the new states of the universe would perform this check. Sorry for the confusion.