Yes, that identified him pretty unambiguously as Mr. Rogers… I was trying to figure out the significance of the bloodstained sweater, if he was thus the “Ultimate Battle” version. :)
Oh, incidentally… It should actually be possible to set up the base level computer to run all the programs for equal amounts of time (at least at the base level; taking into account programs containing other programs is a whole other issue):
Round 1:
Program 1: 1 step
Round 2:
Program 1: 1 step
Program 2: 2 steps
Round 3:
Program 1: 1 step
Program 2: 1 step
Program 3: 3 steps
Round 4:
Program 1: 1 step
Program 2: 1 step
Program 3: 1 step
Program 4: 4 steps
… etc.
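In code, that schedule looks roughly like this (just a rough sketch of the idea; the `programs` iterable and the per-tick `step()` method are stand-ins I'm making up for "the next program in the enumeration" and "advance this program by one tick"):

```python
from itertools import count

def fair_dovetail(programs):
    """`programs` is an (endless) iterable of program objects, each with a
    step() method that advances it by one tick -- both are stand-ins, not
    anything from the story.  After round k, every program started so far
    has received exactly k ticks, matching the schedule sketched above.
    """
    source = iter(programs)
    running = []                       # programs that have already started
    for k in count(1):                 # round k
        for prog in running:           # one catch-up tick for programs 1..k-1
            prog.step()
        prog_k = next(source)          # start program k...
        running.append(prog_k)
        for _ in range(k):             # ...and give it k ticks to catch up
            prog_k.step()
```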
Interesting; I confess I hadn’t thought of that at all! Now I wonder if using this rule along with the underlying anthropic premise would cause subjective experience to dissolve into chaos, or make no discernible difference (i.e. reality still ends up looking just as ordered for the most part), or if it argues against the underlying anthropic premise by showing how easy it is to make probabilities refuse to converge to a timeless limit.
(And yes, it’s that Rogers—you can tell because he’s the closest thing the group has to a leader. One wonders how the blood got on his sweater. Surely it’s not the blood of an enemy, as the original song implies. Perhaps it’s the blood of Big Bird, who died fighting for Amber, or something along those lines.)
Thanks. :) This story simply made me wonder if it was possible to create a “fair” scheduled version of the program-of-all-possible-programs, and this was the first one I came up with. Not sure if there are any more elegant ways of doing that.
Though, of course, this wouldn’t change the whole issue of programs being embedded in other ones… And, actually, my instinct is that, given that all programs are actually run with an unbounded number of cycles, the RATE at which they’re run relative to each other wouldn’t affect the amount of reality-fluid each one got, though there’s much here that I’m confused about on that matter...
(EDIT: to clarify, I don’t think the relative rate they’re run at would make any difference.)
Oh, just for clarification: With the bit about Maria distributing the computing time exponentially according to complexity, did you mean each higher (starting?) complexity program got exponentially less time or exponentially more time? And what was Maria’s motivation there for that scheduling rule?
Simpler programs got exponentially more time. Mostly she’s just trying to match the “natural” distribution of programs, if there is such a thing. Allocating more time to simpler programs may help because it means that, e.g., simple programs which also simulate all programs in order, will get a lot of computing power, so it helps equalize the flow in a way that doesn’t depend as much on your initial choice of universal machine. Another way of looking at it would be that allocating equal time to all the programs would tend to make life less simple—to increase the probability of arbitrary things happening—which seems like a net negative for sentient life, ceteris paribus.
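The story doesn’t spell out a mechanism for that, but one concrete way to give simpler programs exponentially more time is a phase scheme in the spirit of Levin-style search; here’s a rough sketch, where binary strings stand in for programs and `run_one_tick` is a made-up stand-in for advancing a program by one tick:

```python
from itertools import count, product

def exponential_dovetail(run_one_tick):
    """Phase-based sketch: in phase i, every binary program p of length
    L <= i is advanced 2**(i - L) ticks, so within each phase ticks are
    allocated in proportion to 2**(-L): a program one bit shorter gets
    twice the time.  `run_one_tick(p)` is a made-up stand-in for
    advancing program p by a single tick.
    """
    for i in count(1):                                  # phase i
        for length in range(1, i + 1):
            for bits in product("01", repeat=length):   # every program of this length
                p = "".join(bits)
                for _ in range(2 ** (i - length)):
                    run_one_tick(p)
```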
(Alternatively, I wonder if Mr. Rogers has a Superpowered Evil Side and that’s how he got the blood on his sweater.)
Okay then. You may want to edit the phrasing. As written in the story, it seemed a bit ambiguous but leaning toward her stating that she set it up to give more complex programs more time. At least so it read to me.
Hrm… a superpowered evil side for Mr. Rogers. Given that his good side could wrap Senate committees around his fingers (seriously, did you ever watch that vid of him testifying about the importance of not canceling funding for public broadcasting?) just by being in real life the way he was on his show...
But yeah, that story was fun. As delightfully twisted as FRACTRAN. (Yes, I am comparing a story to a model of computation. But, given the nature of the story, is this not perfectly reasonable? :))
Tracked down on your suggestion: super powers indeed!
Told yah. :)
So yeah, a superpowered evil version of Mr. Rogers would be really really scary if one thinks about it. :)
The bloodstained sweater in the original song refers to an urban legend that Mr. Rogers was a Marine sniper in real life.
I’m not convinced that when you look at the whole set of minds doing dovetailing simulations and put probability distributions on how far they go, your algorithm and Eliezer’s give different results. Actually calculating it out looks a bit tough; my intuition is based on the fact that a simulator doing N computations gives program n of the order of Sqrt(N) computations using either algorithm, provided that n << N.
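(Rough arithmetic behind that intuition, taking the original schedule to be "in round k, each of programs 1 through k gets one tick": K rounds then cost about K^2/2 = N ticks, and program n has collected K - n + 1, roughly Sqrt(2N), of them once n << Sqrt(N). Under the equal-time schedule, round k costs 2k - 1 ticks, so K rounds cost exactly K^2 = N and every started program has exactly K = Sqrt(N) ticks. Either way, program n ends up with on the order of Sqrt(N) of the simulator's N computations.)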
Well, since any finite number is smaller than infinity, for ANY program, once it starts running, it would get just as many steps per, well, step, as any other program (in the original version). I.e., consider two programs A and B such that A came earlier. In the original scheduler, once B started up, for each tick A gets, B would also get one tick. But A would also have an initial bunch of ticks that it got before B even started.
My version makes sure that B gets those extra ticks too, that’s all. I, personally, don’t think it would change the probability distributions that would be experienced from the inside, given that the base computation really is run with unbounded resources and so on and so forth.
Ah, I was thinking more of a huge (infinite?) set of simulators, each running for some finite number of ticks. Then the subjective probability of being in program number n is related to the proportion of simulators that run program n for long enough to reach a feasible world for you to be in. So, sure, program A gets more ticks than B in the original scheduler, but I think the determining factor is how many simulators go on to run B at all.
Ooooooh. No, I guess the model we’re using here (that is, the fanfic in question) is that somewhere down the levels there is a single simulator running a “program of all possible programs”.
Although, I wonder if we can then just say the bottom level is Tegmark’s Level 4 Multiverse and get rid of any actual machine or such at the lowest level. :)
Tegmark’s Level 4 doesn’t answer the question of how much weight each experience has. It’s a similar problem to asking where the Born probabilities come from.
Well, it doesn’t seem to me that it’d be any more confusing than “Turing machine running the program of all programs” as far as difficulty of reasoning about weights.
Awesome. I’d upvote your post twice if I could.
Thanks! :)
For my next trick: A scheduler that can look inside a program, take into account the fact that it’s encoding another program, and schedule stuff so that the total number of ticks that each program gets, summed across all instances of the program, including those embedded in other programs, would be approximately the same for all.
Or I’ll just leave this one as an exercise for the reader. :)
And since somewhere in this discussion the link is deserved: The Ultimate Showdown of Ultimate Destiny