There is a fundamental problem with trying to implement relativity in an interpretation of quantum theory which says that the ultimate reality is an assignment of amplitudes to a set of purely spatial “universe configurations”: in such a framework, what is a Lorentz transformation? A Lorentz transformation inherently takes as its input a time series of spacelike hypersurfaces, and yields as its output another such series, obtained by chopping up the first series and reassembling the parts.
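To make the “chopping up” concrete, here is a minimal numerical sketch (my own illustration; the boost velocity β = 0.5 is an arbitrary choice): a boost applied to events that all share one time coordinate scatters them across many different times in the new frame, so a single spatial configuration in one slicing is not a configuration in another.

```python
import numpy as np

# Illustrative sketch: a Lorentz boost along x, with c = 1.
beta = 0.5                                  # assumed boost velocity
gamma = 1.0 / np.sqrt(1.0 - beta**2)
boost = np.array([[gamma, -gamma * beta],   # acts on (t, x) vectors
                  [-gamma * beta, gamma]])

# Five events lying on one constant-time hypersurface, t = 0:
events = np.array([[0.0, x] for x in np.linspace(-2.0, 2.0, 5)])

# After the boost the events no longer share a time coordinate:
# the single t = 0 slice has been spread across the new time slicing.
boosted = events @ boost.T
print(boosted[:, 0])    # five distinct t' values from one t = 0 slice
```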
But where, in the picture offered so far, do we even have a time series of hypersurfaces? Well, we have the histories—trajectories in configuration space—which enter into the Feynman path integral. Since whole histories, and not just configurations, have associated amplitudes, this suggests a way to implement Lorentz invariance: if history H has amplitude A, then the history H’ produced by a Lorentz transformation of H should also have amplitude A. (Or covariance: the history H’ produced by Lorentz transformation L should have amplitude U_L(A), where the amplitude-transforming maps U_L compose according to the Lorentz group.)
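Spelled out (my notation, just to pin down the two options; L·H is the boosted history):

$$ \text{invariance:}\qquad A(L\cdot H) \;=\; A(H) $$

$$ \text{covariance:}\qquad A(L\cdot H) \;=\; U_L\big(A(H)\big), \qquad U_{L_1}\circ U_{L_2} \;=\; U_{L_1 L_2}. $$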
To really work, this seems to require full-fledged cosmological histories—you can’t just talk about finite-time transitions from one hypersurface to another, because under a boost they’ll be broken up in an ugly way (one that can’t be represented as a trajectory in your cosmic configuration space)… Basically, the picture I get from this is that histories, complete cosmic histories, and not configurations, are the entities with which amplitudes should be fundamentally associated. There’s no problem in thinking 4-dimensionally about a history. But you’ll then face the problem of getting Born’s rule back, and I have no idea how hard that will be. In the many-worlds formalism called “decoherent” or “consistent” histories, it is taken for granted that you cannot work with completely fine-grained histories, such as those which notionally enter into a path integral, and still make the formalism work. But maybe it’s different if you work from the start in the full space of histories (dominated as it is by continuous but nondifferentiable trajectories); or maybe quantum gravity requires you to work with some discrete fundamental variables that reduce the space of histories to countable size.
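For reference, the quantity that controls this in the consistent-histories formalism is the decoherence functional (standard notation; ρ is the initial state and the class operator C_α strings together Heisenberg-picture projectors that coarse-grain the history α):

$$ D(\alpha,\beta) \;=\; \mathrm{Tr}\!\left[ C_\alpha \,\rho\, C_\beta^\dagger \right], \qquad C_\alpha \;=\; P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1). $$

The Born-rule probabilities p(α) = D(α, α) are additive only when the off-diagonal terms (approximately) vanish, Re D(α, β) ≈ 0 for α ≠ β, and completely fine-grained path-integral histories generically violate this condition; hence the formalism’s insistence on coarse-graining.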
Jess, I think you will find that the sense in which QFT is Lorentz-covariant does not easily carry over to any “realist” interpretation. In a sense, I was just addressing those difficulties. Yes, QFT gives you a 4-dimensional perspective on things: you can view an observed transition as a superposition of space-time histories, and you can change reference frames (recoordinatize the component histories in a synchronized way) without the ultimate probability changing. But when you ask what’s real, when you try to turn that into an ontology, the configuration-based approach runs into exactly the problems described above, unless you switch to thinking of histories as fundamental. At least, that’s the only answer I can see.
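In equations (again my own gloss, using the invariance condition above): the transition probability is frame-independent because the boost merely relabels the histories being summed over,

$$ P' \;=\; \Big|\sum_{H} A(L\cdot H)\Big|^{2} \;=\; \Big|\sum_{H} A(H)\Big|^{2} \;=\; P, $$

where H ranges over the histories contributing to the original transition and L·H over their boosted counterparts.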