This is a very interesting point, and it somewhat shakes my belief in the current version of MWI.
What I could imagine is that, since the total information content of the multiverse must be finite, there is some additional quantization going on that makes highly improbable branches “too fuzzy” to be observable. Or something like that.
Not likely. You’re already in a highly improbable branch, and it’s getting less probable every millisecond.
We have seen in the sister topic that mangled worlds theory can in fact account for such information loss. However, mangled worlds (MWT) has deficiencies similar to those of single-world interpretations: non-local action, nonlinearity, and discontinuity. That does not mean it can't be true.
I would not state this for sure. There could still be quite a difference between astronomically unlikely and superastronomically unlikely.
So for example, if the total information content of the multiverse is bounded by 2^1000 bits, you could get down to an insanely small probability of 1/2^(2^1000), but not to 1/2^(2^1001), which is “merely” a factor of 2^(2^1000) less probable.
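The arithmetic above can be sketched with toy numbers; here a hypothetical bound of 2^10 bits stands in for the 2^1000 in the comment, since the real numbers are far too large to compute with directly:

```python
from fractions import Fraction

# Hypothetical toy bound: 2**10 = 1024 bits of total information,
# standing in for the 2**1000 bits imagined above.
bound_bits = 2**10

# With bound_bits bits you can index at most 2**bound_bits distinct
# branches, so the smallest resolvable branch probability is
# 1 / 2**bound_bits.
min_prob = Fraction(1, 2**bound_bits)

# Doubling the exponent (the analogue of the 2**1001 case) asks for a
# probability that is a factor of 2**bound_bits smaller still -- just
# past the bound, even though it looks "merely" one step further.
beyond = Fraction(1, 2**(2 * bound_bits))
factor = min_prob / beyond

print(factor == 2**bound_bits)  # True: the gap is itself a factor 2**(2**10)
```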
Why would the information content of a quantum universe be measured in bits rather than qubits? 2^1000 qubits would be enough to keep track of every possible configuration of the Hubble volume without discarding any low-magnitude branches. (Unless, of course, QM does discard low-magnitude branches, in which case your quantum computer would too… but such a circular definition is consistent with any amount of information content.)
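The bits-vs-qubits point can be illustrated with a toy state vector: an n-qubit state carries 2^n complex amplitudes, and unitary quantum mechanics keeps even extremely low-magnitude branches rather than discarding them. Here n = 10 and the specific amplitudes are assumptions for illustration:

```python
import math

# Toy register: 10 qubits span 2**10 = 1024 basis configurations,
# standing in for the 2**1000 qubits imagined above.
n = 10
eps = 1e-12  # an "astronomically" small branch weight (illustrative)

# Amplitudes over all 2**n basis states: one dominant branch and one
# extremely low-magnitude branch.
amplitudes = [0j] * (2**n)
amplitudes[0] = complex(math.sqrt(1 - eps))
amplitudes[-1] = complex(math.sqrt(eps))

# The state tracks every configuration; the tiny branch is not dropped,
# and the total norm stays 1 under unitary evolution.
norm = sum(abs(a)**2 for a in amplitudes)
print(len(amplitudes))       # 1024 configurations for 10 qubits
print(abs(norm - 1) < 1e-9)  # True: still normalized, tiny branch included
```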