The framework of Wheeler-Feynman theory is just classical Maxwell electrodynamics with waves that converge on a charged particle as well as waves that spread from it. So it ought to be just as relativistic, local, and deterministic as ordinary Maxwell electrodynamics, except that now you’re interested in solutions that have two oppositely directed arrows of time, rather than just one. (Remember that the equations themselves are time-symmetric, so “emission” of radiation can, in principle, run in either direction.)
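To be concrete about the standard prescription (nothing here beyond the textbook statement of absorber theory): each charge i generates the time-symmetric field

$$ F_{(i)} = \tfrac{1}{2}\left(F_{(i)}^{\mathrm{ret}} + F_{(i)}^{\mathrm{adv}}\right), $$

and charge j responds only to the fields of the other charges, \(\sum_{i\neq j} F_{(i)}\). If the rest of the universe acts as a perfect absorber, that sum works out to the usual retarded field plus the radiation-reaction force on j, which is how the theory recovers the ordinary one-way appearance of radiation from time-symmetric ingredients.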
In practice, they hacked the theory by hand to remove the self-interaction of particles (a particle absorbing its own emissions at a later or earlier time), because self-interaction produced infinite, incalculable forces; but then they were unable to account for the Lamb shift, which does come from self-interaction; and then somehow Feynman made the leap to path integrals, and in the quantum framework the infinities of self-interaction could be dealt with through renormalization.
It may seem like a big leap from the classical to the quantum picture. But classical dynamics can be expressed as wave motion in configuration space via the Hamilton-Jacobi equation, and it’s not a big step from the HJE to quantum mechanics. Also, doing anything practical with path integrals usually involves working with classical solutions to the equation of motion, which dominate the quantum theory because the action is stationary there and neighboring histories interfere constructively, and then looking at corrections that come from those neighboring histories.
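To see how short that step is (this is just the standard polar-decomposition calculation, quoted for orientation): write \(\psi = R\,e^{iS/\hbar}\) and substitute into the Schrödinger equation

$$ i\hbar\,\partial_t \psi = -\frac{\hbar^2}{2m}\nabla^2\psi + V\psi. $$

The real part gives

$$ \partial_t S + \frac{(\nabla S)^2}{2m} + V - \frac{\hbar^2}{2m}\frac{\nabla^2 R}{R} = 0, $$

which is the classical Hamilton-Jacobi equation plus a single extra “quantum potential” term that vanishes as \(\hbar \to 0\); the imaginary part is just a continuity equation for \(R^2\). In that sense quantum mechanics sits one term away from classical wave motion in configuration space.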
It’s quite conceivable that these quantum deviations from classicality result from the interference of forward causality and retrocausality. Maybe the Wheeler-Feynman theory just needs some extra ingredient, like micro time loops from general relativity, in order to become consistent. We would be dealing with a single-world model which is locally causal but not globally causal, in the sense that the future would also be shaped by the distribution of micro time loops, and that distribution is not determined by the world’s current state. Our world would be one of an ensemble of self-contained, globally consistent “classical” histories, and the quantum probability calculus (including the Born rule) would just turn out to be how to do probability theory in a world where influences come from the future as well as from the past. For example, the Aharonov “two-state-vector formalism” might show up as the way to do statistical mechanics if you know yourself to be living in such an ensemble. There would be no ontological superpositions. Wavefunctions would just be “probability distributions with a future component”.
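For a taste of what such a probability calculus already looks like inside ordinary quantum mechanics: the two-state-vector formalism assigns probabilities to an intermediate measurement outcome \(a_j\) conditioned on both a pre-selected state \(|\psi\rangle\) and a post-selected state \(|\phi\rangle\), via the Aharonov-Bergmann-Lebowitz rule

$$ P(a_j \mid \psi,\phi) = \frac{\big|\langle\phi|a_j\rangle\langle a_j|\psi\rangle\big|^2}{\sum_k \big|\langle\phi|a_k\rangle\langle a_k|\psi\rangle\big|^2}, $$

in which the past and future boundary conditions enter symmetrically. The speculation is that something of this form would fall out as ordinary statistical reasoning within a globally constrained history, rather than being an extra postulate.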
The status of these speculations is remarkably similar to the status of many worlds. The construction of an exact theory along these lines, with a clear explanation of how it connects to reality, remains elusive, but you can assemble suggestive facts from the quantum formalism to make it plausible, and there is a long tradition of people trying to make it work, one way or another: Wheeler and Feynman, John Cramer, Yakir Aharonov.
Practical QM contains the dualism of wavefunctions and classical observables. Many worlds reifies just the wavefunction and tries to find the observables in it. Retrocausality just keeps the classical part and tries to explain the wavefunction as something to do with forwards and backwards causality. Bohmian mechanics keeps the wavefunction and then fleshes out the classical part in a way governed by the wavefunction. Nomological Bohmian mechanics keeps the classical part of Bohmian mechanics, and replaces the wavefunction with an additional nonlocal potential in the classical equations of motion. If you could obtain that nonlocal potential from a local retrocausal theory, you would finally have an exact, single-world, deterministic explanation of quantum mechanics.
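For reference, the split the last two options make can be written out (standard Bohmian formulas, just quoted here): with \(\psi = R\,e^{iS/\hbar}\) on configuration space, the particles follow the guidance equation

$$ \frac{dQ_k}{dt} = \frac{\nabla_k S(Q_1,\dots,Q_N,t)}{m_k}, $$

or equivalently, in second-order form, Newton’s law with one extra term,

$$ m_k \ddot{Q}_k = -\nabla_k\left(V + Q_{\mathrm{pot}}\right), \qquad Q_{\mathrm{pot}} = -\sum_j \frac{\hbar^2}{2m_j}\frac{\nabla_j^2 R}{R}, $$

where the quantum potential \(Q_{\mathrm{pot}}\) depends on the whole configuration at once. That is exactly the kind of nonlocal ingredient a local retrocausal theory would have to reproduce.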