My response to the mugger:
You claim to be able to simulate 3^^^^3 unique minds.
It takes log2(3^^^^3) bits just to count that many things, so my absolute upper bound on the prior for an agent capable of doing this is 1/3^^^^3.
My brain is unable to process enough evidence to overcome this, so unless you can use your matrix powers to give me access to sufficient computing power to change my mind, get lost.
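To make the arithmetic behind that bound explicit (this is just the memory-penalty scheme from my reply below, with base-2 logs since we’re counting bits, and 3↑↑↑↑3 written in up-arrow notation for 3^^^^3):

$$
P(\text{capable agent}) \le 2^{-\log_2\left(3\uparrow\uparrow\uparrow\uparrow 3\right)} = \frac{1}{3\uparrow\uparrow\uparrow\uparrow 3}
$$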
My response to the scientist:
Why yes, you do have sufficient evidence to overturn our current model of the universe, and if your model is accurate, the computational capacity of the universe is vastly larger than we thought.
Let’s try building a computer based on your model and see if it works.

Edit: formatting fixed. Thanks, wedrifid.
Try an additional line break before the first bullet point.
Why does that prior follow from the counting difficulty?
Using (length of program) + (memory required to run program) as a penalty makes more sense to me than (length of program) + (size of impact). I am assuming that any program that can simulate X minds must be able to handle numbers the size of X, so it would need more than log2(X) bits of memory, which makes the prior less than 2^-log2(X) = 1/X.
I wouldn’t be overly surprised if some other situation breaks this idea too; I’m just posting the first thing that came to mind when I read this.
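A minimal sketch of that penalty in Python (the function names and the 10^100 stand-in are mine, purely for illustration; 3^^^^3 itself won’t fit in any physical machine):

```python
import math

def penalty_prior(program_length_bits: float, memory_bits: float) -> float:
    """Prior = 2^-(length of program + memory required to run it)."""
    return 2.0 ** -(program_length_bits + memory_bits)

# A program that simulates X minds must at least be able to count them,
# which takes more than log2(X) bits of memory.
X = 10 ** 100                # stand-in for 3^^^^3, which overflows everything
memory_floor = math.log2(X)  # ~332 bits just to hold the number X

# Even crediting the mugger with a zero-length program, the prior
# is capped at 2^-log2(X) = 1/X:
print(penalty_prior(0, memory_floor))  # ~1e-100
print(1 / X)                           # 1e-100, the same bound
```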
You’re trying to italicize those long statements? It’s possible that you need to get rid of the spaces around the asterisks.
But you’re probably better off just using quote boxes with “>” instead.