A major problem with Robin’s theory is that it seems to predict things like, “We should find ourselves in a universe in which lots of decoherence events have already taken place,” a tendency which does not seem especially apparent.
Actually, the theory suggests we should find ourselves in a state with near the least feasible number of past decoherence events.
I don’t understand this—doesn’t decoherence occur all the time, in every quantum interaction between amplitudes? So, for every amplitude separate enough to count as a “particle” in the universe (=factor), won’t it decohere with other factors every Planck time?
Or did I misunderstand something big time here?
Cheers, Peter
I’d also love to know the answer to Peter’s question… A similar question is whether we should expect all worlds to eventually become mangled (assuming the “mangled worlds” model). I understand “world” to mean “somewhat isolated blob of amplitude in an amplitude distribution”—is that right?
The answer to Peter’s question is: no, decoherence doesn’t happen with a constant rate and it certainly doesn’t happen on the Planck time scale.
The answer to your question is that “mangled worlds” is a collapse theory: some worlds get mangled and go away, leaving other worlds.
Then I’m still unclear about what a world is. Care to explain?
Eliezer gave a simpler answer to my question: “yes”. (I’m still not sure what yours means.)
Back to Peter’s question. What makes you say decoherence doesn’t happen on the Planck time scale? Can you explain that further?
Any given instance of decoherence is an interaction between two or more particles. And all known interactions take rather longer than Planck time.
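For a sense of scale, here’s a quick back-of-the-envelope comparison. The Planck time follows from standard constants; the interaction timescale is an assumed illustrative figure (roughly the duration of a strong-force interaction, among the fastest known processes), not a measured decoherence time:

```python
import math

# Physical constants (SI units, CODATA values)
hbar = 1.054571817e-34  # reduced Planck constant, J*s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

# Planck time: t_P = sqrt(hbar * G / c^5), about 5.4e-44 s
t_planck = math.sqrt(hbar * G / c**5)

# Assumed illustrative figure: strong-force interactions take
# on the order of 1e-23 s, among the fastest known interactions.
t_interaction = 1e-23

print(f"Planck time:      {t_planck:.2e} s")
print(f"Interaction time: {t_interaction:.2e} s")
print(f"Ratio:            {t_interaction / t_planck:.2e}")
```

Even the fastest known interactions are some twenty orders of magnitude slower than the Planck time, which is the point above.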
There are probably enough decoherence events in the universe that at least one occurs somewhere in each Planck time unit. But that doesn’t instantly decohere everything. Other objects remain coherent until they interact with the decohered system (unless, of course, they decohere on their own), and that interaction is limited by the rate at which information propagates (both latency and bandwidth). I.e., after a blob of amplitude has split, the sub-blobs are only separated along some dimensions of configuration space, and retain the same cross-section along the rest of the dimensions (hence “factors”).
Okay, given one sub-decoherence event per Planck time, somewhere in the universe, propagating throughout it at some rate less than or equal to the speed of light... we either have constant (one per Planck time or less) full decoherence events after some fixed time, as each finishes propagating sufficiently, or we have no full decoherence events at all, as the sub-decoherences fail to decohere the whole sufficiently.
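To gauge how slow “propagating sufficiently” is at light speed, here’s a toy calculation (the diameter figure is a rough assumption for the observable universe, used only for illustration):

```python
# Toy illustration: even at light speed, a decoherence "front"
# takes enormous time to spread across the universe, so a single
# sub-decoherence event cannot quickly decohere the whole.
c = 2.99792458e8              # speed of light, m/s
observable_diameter = 8.8e26  # assumed ~diameter of observable universe, m
seconds_per_year = 3.156e7

crossing_time_s = observable_diameter / c
print(f"Light-crossing time: {crossing_time_s / seconds_per_year:.1e} years")
```

On the order of tens of billions of years just to cross the observable universe, which makes the “no full decoherence events” horn of the dilemma look plausible even before considering cosmic expansion.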
The latter seems more realistic, especially given the light-speed limit, since the expansion of space can completely causally isolate two parts of the universe, preventing the propagation of the decoherence.
So, with this understood, we’re left to determine how large a portion of the universe has to be decohered to qualify as a “decoherence event” in the many-worlds theories that rely on the term. I honestly doubt that, once a suitable determination has been made, the events will be infrequent in almost any sense of the word. Given the massive quantity of interactions in our universe (even just the causally linked subspace we inhabit), the frequency of decoherence events really should be ridiculously high. And given some basic uniformity assumptions, the rate should be quite regular too.