Right, but (before reading your post) I had assumed that the eigenvectors somehow “popped out” of the Everett interpretation.
This is a bit of a tangent but decoherence isn’t exclusive to the Everett interpretation. Decoherence is itself a measurable physical process independent of the interpretation one favors. So explanations which rely on decoherence are part of all interpretations.
I mean, in the setup you describe, there isn’t any reason why we can’t call the “state space” the observer space and the observer “the system being studied”, and then write down the same system from the other point of view...
In the derivations of decoherence you make certain approximations which, loosely speaking, depend on the environment being large relative to the quantum system. If you swap the roles, these approximations are no longer valid. I’m not sure if we are on the same page regarding decoherence, though (see my other reply to your post).
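To make the “big environment” point concrete, here is a toy sketch of my own (not from the thread, and the function name and parameter values are made up): a system qubit in an equal superposition couples to N environment qubits, each of which ends up in a state that depends on the system. Tracing out the environment, the off-diagonal (“coherence”) element of the system’s density matrix picks up a factor of the overlap ⟨e0|e1⟩ for every environment qubit, so it decays exponentially in N. Swapping the roles, a one-qubit “environment” contributes only a single such factor, and the coherence is barely suppressed.

```python
import numpy as np

def reduced_density_matrix(n_env, theta=0.4):
    """System qubit starts in (|0>+|1>)/sqrt(2); each of n_env environment
    qubits ends in |e0> = |0> if the system is |0>, or
    |e1> = cos(theta)|0> + sin(theta)|1> if the system is |1>.
    Returns the 2x2 reduced density matrix of the system."""
    e0 = np.array([1.0, 0.0])
    e1 = np.array([np.cos(theta), np.sin(theta)])
    # Build the two branches |0>|e0...e0> and |1>|e1...e1>
    branch0 = np.array([1.0, 0.0])
    branch1 = np.array([0.0, 1.0])
    for _ in range(n_env):
        branch0 = np.kron(branch0, e0)
        branch1 = np.kron(branch1, e1)
    psi = (branch0 + branch1) / np.sqrt(2)
    # Partial trace over the environment: rho[i,j] = sum_k psi[i,k] psi*[j,k]
    psi = psi.reshape(2, 2 ** n_env)
    return psi @ psi.conj().T

for n in (0, 1, 5, 10):
    rho = reduced_density_matrix(n)
    print(n, abs(rho[0, 1]))  # coherence shrinks as the environment grows
```

With no environment the coherence is 0.5 (a pure superposition); each added environment qubit multiplies it by cos(0.4) ≈ 0.92, so a large environment drives it toward zero, which is the regime where the usual approximations hold.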
What goes wrong if we just take our “base states” as discrete objects and try to model QM as the evolution of probability distributions over ordered pairs of these states?
You might be interested in Lucien Hardy’s attempt to find a more intuitive set of axioms for QM compared to the abstractness of the usual presentation: https://arxiv.org/abs/quant-ph/0101012
Isn’t the whole point of the Everett interpretation that there is no decoherence? We have a Hilbert space for the system, and a Hilbert space for the observer, and a unitary evolution on their tensor product space. With these postulates (and a few more), we can start with a pure state and end up with an entangled state in the product space, which we then interpret as being “multiple observers”, right? I mean this is how I read your paper.
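The setup described above can be sketched in a few lines (my own toy model, not taken from the paper under discussion): a superposed system qubit, an observer qubit in a “ready” state, and a CNOT as the unitary “measurement” interaction. Tracing out the system leaves the observer in a mixed state with one branch per outcome.

```python
import numpy as np

up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

system = (up + down) / np.sqrt(2)   # superposed system, pure state
observer = up                        # observer in its "ready" state

psi = np.kron(system, observer)      # joint pure state in the product space

# CNOT: flips the observer qubit iff the system qubit is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
psi = CNOT @ psi                     # entangled: (|00> + |11>)/sqrt(2)

# Reduced state of the observer (partial trace over the system):
psi2 = psi.reshape(2, 2)             # axis 0: system, axis 1: observer
rho_obs = psi2.T @ psi2.conj()       # 2x2 density matrix
print(rho_obs)                       # diag(0.5, 0.5): two "observer branches"
```

The evolution is entirely unitary; the mixedness only appears when you restrict attention to the observer factor, which is the step usually read as “splitting”.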
We are surely not on the same page regarding decoherence, as I know almost nothing about it :)
The arXiv link looks interesting, I should have a look at it.
Yes, the coherence-based approach (Everett’s original paper, early MWI) is quite different to the decoherence-based approach (Dieter Zeh, post 1970).
Deutsch uses the coherence-based approach, while most other many-worlders use the decoherence-based approach.
He absolutely does establish that quantum computing is superior to classical computing, that underlying reality is not classical, and that the superiority of quantum computing requires some extra structure to reality. What the coherence-based approach does not establish is whether the extra structure adds up to something that could be called “alternate worlds” or parallel universes, in the sense familiar from science fiction.
In the coherence-based approach, “worlds” are coherent superpositions. That means they exist at small scales, they can continue to interact with each other after “splitting”, and they can be erased. These coherent superposed states are the kind of “world” we have direct evidence for, although they seem to lack many of the properties required for a fully fledged many-worlds theory, hence the scare quotes.
In particular, if you just model the wave function, the only results you will get represent every possible outcome. In order to match observation, you will have to keep discarding unobserved outcomes and renormalising, as you do in every interpretation. It’s just that this extra stage is performed manually, not by the program.
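That manual extra stage can be written out explicitly. A minimal sketch (the function name is my own invention): after simulating the full wave function, project onto the outcome actually observed and renormalise by hand.

```python
import numpy as np

def condition_on_outcome(psi, outcome_index):
    """Discard all amplitudes except the observed one, then renormalise."""
    projected = np.zeros_like(psi)
    projected[outcome_index] = psi[outcome_index]
    norm = np.linalg.norm(projected)
    if norm == 0:
        raise ValueError("observed outcome has zero amplitude")
    return projected / norm

psi = np.array([0.6, 0.8])           # amplitudes over two possible outcomes
print(condition_on_outcome(psi, 1))  # [0., 1.]
```

Nothing in the unitary simulation itself performs this step; it is imposed from outside whenever you compare the simulation with what was actually seen.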