Argh, you beat me to it! But frankly, how’s that not obvious? Omega is giving us unbounded computational power, and we wouldn’t use it?
Now, there may be a catch. Nothing says the hyper-computer actually computes the programs, even those that do return a value. It could, for instance, detect the separation between your nice simulated advanced civilization and the background program, and not compute the simulation at all. You could counteract that strategy, but then the hyper-computer may be smarter than that.
Looking down the thread, I think one or two others may have beaten me to it too. But yes, it seems at least that Omega would be handing the programmers a really nice toy, and (conditional on the programmers having the skill to wield it), well…
Yes, there is that catch, hrm… You could put something into the code that makes the inhabitants occasionally work on the problem, thus really deeply intertwining the two things.
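Something like this toy sketch, say (Python, with made-up names and a stand-in "problem"; nothing here is specific to the hypercomputer setup): the simulated agents' internal state feeds directly into the value the program returns, so any optimization that preserves the output has to carry their computation along.

```python
# Toy illustration only: the simulated "inhabitants" occasionally do pieces of
# the background problem, and the program's return value depends on their work.
# The agent model and the "problem" here are purely illustrative.

def problem_step(i: int) -> int:
    """Stand-in for one unit of the background computation."""
    return (i * i + 1) % 97

def run_simulation(steps: int) -> int:
    checksum = 0
    agents = [7, 11, 13]  # toy "inhabitants", represented as bare seeds
    for t in range(steps):
        # Ordinary simulated life: agents evolve their own state each tick.
        agents = [(a * 31 + t) % 1009 for a in agents]
        # Every few ticks an agent contributes to the background problem,
        # folding its internal state into the result. Skipping the simulation
        # would change the value returned, so it can't be optimized away
        # without changing the output.
        if t % 5 == 0:
            checksum ^= problem_step(agents[t % len(agents)])
    return checksum

if __name__ == "__main__":
    print(run_simulation(1000))
```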
This is what’s rather unsatisfactory with the notion of subjective experience as ‘computation’: optimizations that do not affect the output may be unsafe from the inside perspective. Even if the beings inside the simulator sometimes work on the problem, the hyper-compiler might optimize too much out. Essentially, you end up with ‘zombie’ hypercomputers that don’t have anyone inside, and ‘non-zombie’ hypercomputers inside of which beings really live.