Do I think we’re in a simulation? No. But while the reasons why a perfect simulation is possible aren’t necessarily obvious, they are compelling.
Quantum mechanical computation depends on the energy splitting (energy difference) between levels: different energy levels of a quantum mechanical system oscillate relative to each other, bigger energy splittings mean faster relative oscillations, and faster oscillations mean you can get more computation done per unit time. So if you want to simulate a quantum system perfectly in faster than real time, you just have to make a model that is higher energy. The cool thing is that the model doesn’t necessarily have to be arranged like the actual system: quantum computers designed to simulate chemical reactions can be just a line or grid of atoms linked by light, and as long as the interactions between the atoms are proportional to the interactions in the modeled system, the computer works fine. This would allow, for example, a spatially 3d universe to be simulated within a 5d universe just by making the right connections.
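To put one formula under “bigger splitting means faster oscillation” (just a standard two-level sketch plus the Margolus–Levitin bound, nothing specific to the systems above): a superposition of two levels separated by ΔE picks up relative phase at a rate set by ΔE, and the number of orthogonal-state steps available in time t is capped by the average energy E above the ground state:

$$
|\psi(t)\rangle = c_0\,|E_0\rangle + c_1\, e^{-i\,\Delta E\, t/\hbar}\,|E_1\rangle,
\qquad
N_{\perp} \;\lesssim\; \frac{2\,E\,t}{\pi\hbar}.
$$

So, roughly, doubling the energy of the model lets it step through the same dynamics about twice as fast.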
Actually, now that I think about it, that may not be the heart of your post—it may be speculation about “subjective experience” rather than the practicality of simulations, which would make it even worse than I’d first thought.
Yes, you could in principle create a dissimilar but isomorphic quantum system to simulate reality. My argument is that the real one will take less stuff to build by a very large factor, where the factor is large enough that “stuff” can be validly taken to mean any of matter, energy, or negentropy.
Phew, I’m relieved your argument isn’t something like “a simulation would by assumption be ‘grainier’ than a natural universe, and so it would ‘split’ less often, and so have less ‘subjective experience.’”
As to it being a gigantic pain in the ass to simulate an entire universe: sure, and it’s unlikely that we’re in a simulation. But ignoring units is typically only justified when even the exponent is huge; 10^10^10 meters is 10^(10^10 − 3) kilometers, which is still pretty much 10^10^10. On the other hand, it should only take some well-designed nanotech to keep things running, which is a factor of 10^20 at the worst, and 20 isn’t a huge exponent. That’s certainly more than we have in our universe, but it’s well within what we could have if we had a few extra spatial dimensions or a different history of our vacuum energy or something.
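A throwaway sanity check of that exponent arithmetic (my illustration, nothing more):

```python
# Changing units from meters to kilometers only subtracts 3 from the
# *inner* exponent of 10^(10^10), so the unit choice is invisible at that scale.
inner_exp_meters = 10**10            # exponent of 10 when measured in meters
inner_exp_km = inner_exp_meters - 3  # dividing by 1000 subtracts 3 from the exponent
print(inner_exp_km / inner_exp_meters)  # 0.9999999997 -- effectively unchanged
print(20 / inner_exp_meters)            # 2e-09 -- by the same standard, an exponent of 20 is small
```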
The interesting question is: “Do universes exist with a higher computational capacity than ours? How much higher? Orders of magnitude higher? Degrees of infinity higher? Arbitrarily higher?”
To clarify: I mean that a sim would either be “grainier” (not in any sense that would be detectable from inside, but just in the sense that it used some pseudorandom numbers as a proxy for quantum branching), or bigger in terms of stuff, or both, since there are plenty of orders of magnitude to spread between those options.
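Concretely, something like this toy sketch (mine, purely illustrative) is what I have in mind by “pseudorandom numbers as a proxy for quantum branching”:

```python
import random

# A "grainy" simulator that, instead of tracking every quantum branch, keeps
# exactly one outcome per measurement, chosen by a seeded pseudorandom
# generator weighted by |amplitude|^2.
rng = random.Random(42)  # fixed seed: deterministic from outside, random-looking inside

def measure(amplitudes):
    """Collapse to a single branch with Born-rule probabilities."""
    probs = [abs(a) ** 2 for a in amplitudes]
    return rng.choices(range(len(probs)), weights=probs)[0]

# From inside, outcomes follow the usual statistics; the simulator just never
# pays for the branches it discards.
print([measure([2**-0.5, 2**-0.5]) for _ in range(10)])
```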
As to “well-designed nanotech” on the order of 10^20… that’s vaguely plausible, but it’s also plausible that it just wouldn’t be able to handle the wide variety of quantum entanglement that matters in the world we observe. Remember, even simple facts like “light travels in a straight line” are, at root, a result of quantum interference, conceivable as a sum over infinitely many Feynman diagrams. While it is certainly possible to create heuristics, perhaps even perfect algorithms, to reproduce any one quantum effect like that, I’m skeptical that you can just induct from there up to the quantum soup we swim in. So I’d still guess 10^(10^x) with x >= 2 (note: I had said x = 10, but on second thought it’s probably either impossible or easier than that).
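To spell out the straight-line example with one formula (my illustration, in the path-sum picture rather than the full diagrammatic expansion): every path from x to y contributes a phase set by its optical length, and only paths near the stationary, i.e. straight, one interfere constructively:

$$
A(x \to y) \;\propto\; \sum_{\text{paths } P} e^{\,i\,2\pi L(P)/\lambda},
\qquad
\delta L(P) = 0 \;\Rightarrow\; P \text{ is the straight line.}
$$

Any coarse-grained shortcut has to reproduce not just this one cancellation pattern but the whole tangle of them interacting at once, which is why I don’t trust the 10^20 figure.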