I have three pretty significant questions: Are you a strong rationalist (good with the formalisms of Occam's Razor)? Are you at all familiar with String Theory (in the sense of doing the basic equations)? If yes to both, what is your Bayes-goggles view on String Theory?
What on earth is the String Theory controversy about, and is it resolvable at a glance like QM’s MWI?
There isn’t a unified “string theory controversy”.
The battle-tested part of fundamental physics consists of one big intricate quantum field theory (the standard model, with all the quarks, leptons etc) and one non-quantum theory of gravity (general relativity). To go deeper, one wishes to explain the properties of the standard model (why those particles and those forces, why various “accidental symmetries” etc), and also to find a quantum theory of gravity. String theory is supposed to do both of these, but it also gets attacked on both fronts.
Rather than producing a unique prediction for the geometry of the extra dimensions, leading to unique and thus sharply falsifiable predictions for the particles and forces, present-day string theory can be defined on an enormous, possibly infinite number of backgrounds. And even with this enormous range of vacua to choose from, it’s still considered an achievement just to find something with a qualitative resemblance to the standard model. Computing e.g. the exact mass of the “electron” in one of these stringy standard models is still out of reach.
Here is a random example of a relatively recent work of string phenomenology, to give you an idea of what is considered progress. The abstract starts by saying that certain vacua are known which give rise to “the exact MSSM spectrum”. The MSSM is the standard model plus minimal supersymmetry. Then they point out that these vacua will also have to have an extra electromagnetism-like force (“gauged U(1)_B-L”). We don’t see such a force, so therefore the “B-L” photons must be heavy, and the gist of the paper is to point out that this can be achieved if one of the neutrino superpartners acts like a Higgs field (by “acquiring a vacuum expectation value”). In fact this paper doesn’t contain string calculations per se; it’s an argument at the level of quantum field theory, that the field-theory limit of these string models is potentially consistent with experiment.
That might not sound exciting, but in fact it’s characteristic, not just of string phenomenology, but of theoretical particle physics in general. Progress is incremental. Grand unified theories don’t explain the masses of the particles, but they can explain the charges. String theory hasn’t yet explained the masses, but it has the potential to do so, in that they will be set by the stabilized size and shape of the extra dimensions. The topology of the extra dimensions is (currently) a model-building choice, but once that choice is made, the masses should follow, they’re not free parameters as in field theory.
As for what might determine the topology of the extra dimensions, anthropic selection is a popular answer these days—and that has become another source of dissatisfaction for string theory’s critics, because it looks like another step back from predictivity. Except in very special cases like the cosmological constant, where a large value makes any kind of physical structure impossible, there’s enormous scope for handwaving explanations here… Actually, there are arguments that the different vacua of the “landscape” should be connected by quantum tunneling, so the vacuum we are in may be a long-lived metastable vacuum arrived at after many transitions in the primordial universe. But even if that’s true, it doesn’t tell you whether the number of metastable minima in the landscape is one or a googol. This is an aspect of string theory which is even harder than calculating the particle masses in a particular vacuum, judging by the amount of attention it gets. The empirical side of string theory is still dominated by incrementally refining the level of qualitative approximation to the standard model (including the standard cosmological model, “lambda CDM”) that is possible.
As for quantum gravity, the situation is somewhat different. String theory offers a particular solution to the problems of quantum gravity, like accounting for black hole entropy, preserving unitarity during Hawking evaporation, and making graviton behavior calculable. I’d say it is technically far ahead of any rival quantum gravity theory, but none of that stuff is observable. So approaches to quantum gravity which are much less impressive, but also much simpler, continue to have supporters.
I don’t do formal Bayes or Kolmogorov on a daily basis; in particle physics Bayes usually appears in deriving confidence limits. Still, I’m reasonably familiar with the formalism. As for string theory, my jest in the OP is quite accurate: I dunno nuffin’. I do have some friends who do string-theoretical calculations, but I’ve never been able to shake out an answer to the question of what, exactly, they’re calculating. My basic view of string theory has remained unchanged for several years: Come back when you have experimental predictions in an energy or luminosity range we’ll actually reach in the next decade or two. Kthxbye.
The controversy is, I suppose, that there’s a bunch of very excited theorists who have found all these problems they can sic their grad students on, problems which are hard enough to be interesting but still solvable in a few years of work; but they haven’t found any way of making, y’know, actual predictions of what will happen in current or planned experiments if their theory is correct. So the question is, is this a waste of perfectly good brains that ought to be doing something useful? The answer seems to me to be a value judgement, so I don’t think you can resolve it at a glance.
This is roughly what I can discern from outside academia in general (I’m 19 years old and at time of posting about to graduate the local equivalent of high-school).
What on earth is the String Theory controversy about, and is it resolvable at a glance like QM’s MWI?
I wonder how you resolve the MWI “at a glance”. There are strong opinions on both sides, and no convincing (to the other side) argument to resolve the disagreement. (This statement is an indisputable experimental fact.) If you mean that you are convinced by the arguments from your own camp, then I doubt that it counts as a resolution.
Also, Occam's razor is nearly always used by physicists informally, not calculationally (partly because Kolmogorov complexity is not computable).
As for string theory, I don't know how to use Bayes to evaluate it. On one hand, this model gives some hope of eventually finding something workable, since it has provided a number of tantalizing hints, such as the holographic principle and various dualities. On the other hand, every testable prediction it has ever made has been successfully falsified. Unfortunately, there are few other competing theories. My guess is that if something better comes along, it will yield string theory in some approximation.
I wonder how you resolve the MWI “at a glance”. There are strong opinions on both sides, and no convincing (to the other side) argument to resolve the disagreement. (This statement is an indisputable experimental fact.) If you mean that you are convinced by the arguments from your own camp, then I doubt that it counts as a resolution.
MagnetoHydroDynamics may find this most useful as an answer to his first question rather than to his question about string theory. It gives him significant information about your rationalist strengths and ability to apply Occam's Razor usefully. To use the language above, we could describe this in terms of 'camps'. Magneto can identify you as not part of his desired camp and correctly use that to determine how much weight to place on your testimony in other areas. (Not belonging to his 'camp', you would naturally either disagree or take offence at his disrespect.)
Evaluating 'rationalist strengths' via answers to questions about physics you don't actually know well enough to evaluate is also a very effective way to be stupid and reveal your own ignorance of QM.
As wedrifid says, this comment tells me that you are, regrettably, not as strong a Bayesian as I would wish many physicists were.
Resolving MWI at a glance involves looking at the Schroedinger equation and concluding that it gives rise to decoherence, and that when decoherence gets large enough the Schroedinger equation says nothing anomalous happens; it just keeps on being decoherent.
That is literally the whole of the argument, and to me saying that something extra and mysterious happens is stupid in an absolute sense. Run a two-particle, single-spatial-dimension, time-dependent sim of the Schroedinger equation, starting with a high level of quantum independence, and you will see decoherence as plain as day.
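A minimal sketch of the kind of simulation described above (a hedged illustration, not the poster's actual program: the split-step Fourier integrator, the soft repulsive interaction, and all grid parameters are made up here). "Decoherence" of either particle alone shows up as entanglement between the two, which the sketch tracks via the purity of one particle's reduced density matrix:

```python
import numpy as np

# Two particles on a line: psi[i, j] ~ psi(x1_i, x2_j); hbar = m = 1.
N, box = 64, 20.0
x = np.linspace(-box / 2, box / 2, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
dt, steps = 0.005, 400

# Initial product state ("high quantum independence"): two Gaussian
# wave packets moving toward each other.
g1 = np.exp(-(x + 3) ** 2 + 2j * x)
g2 = np.exp(-(x - 3) ** 2 - 2j * x)
psi = np.outer(g1, g2)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx * dx)

# Hypothetical soft repulsive interaction V(x1 - x2).
X1, X2 = np.meshgrid(x, x, indexing="ij")
V = 5.0 / np.cosh(X1 - X2) ** 2
K1, K2 = np.meshgrid(k, k, indexing="ij")
kinetic = np.exp(-0.5j * (K1 ** 2 + K2 ** 2) * dt)

def purity(psi):
    """Tr(rho1^2) for particle 1's reduced density matrix (trace over x2)."""
    rho1 = psi @ psi.conj().T * dx
    return float(np.real(np.einsum("ij,ji->", rho1, rho1)) * dx * dx)

p_start = purity(psi)            # 1.0 for a product state
for _ in range(steps):           # Strang-split unitary evolution
    psi = np.exp(-0.5j * V * dt) * psi
    psi = np.fft.ifft2(kinetic * np.fft.fft2(psi))
    psi = np.exp(-0.5j * V * dt) * psi
p_end = purity(psi)              # drops below 1: the particles entangled
```

After the packets scatter, p_end falls well below p_start; particle 2 is effectively acting as a minimal "environment" for particle 1. Whether that counts as seeing decoherence without a proper environment is exactly what gets disputed later in this thread.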
Decoherence is simple, falsifiable, and explains all hitherto observed data.
The Collapse postulate breaks CPT symmetry, violates conservation of the quantum Hamiltonian, violates Liouville's theorem, violates relativistic locality, and is non-linear, non-unitary, non-differentiable, inherently stochastic, poorly defined, anthropocentric, and formulated in deep confusion.
Pick your side.
As wedrifid says, this comment tells me that you are, regrettably, not as strong a Bayesian as I would wish many physicists were.
And your comment makes obvious that you are not a physicist, and have learned QM from someone who is not a physicist. Quick, without looking it up- what percentage of physicists subscribe to MWI? What are two alternative interpretations of QM besides Copenhagen and MWI?
Sure! The approach is the informative part, and I should have worded my post better to make that clearer. Something along the lines of “why do you believe that many physicists reject MWI for those reasons?” would have been less confrontational and probably more communicative.
And your comment makes obvious that you are not a physicist, and have learned QM from someone who is not a physicist.
Yes, I am not a physicist; I will at best be a first-year CS bachelor student in a little over a year from time of posting. I am, however, really good at mathematics. Good enough, in fact, to solve partial differential equations of complex scalar fields and simulate them with custom-written C programs as a hobby.
I might not know the first equation of QFT, but I can write down a Schroedinger wave-packet equation, derive its time-dependent differential form, discretize it, and simulate it in a two-dimensional discrete Hilbert space with dependent potential wells, setting the initial state to high independence.
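The recipe described there (write down the equation, discretize it, step it forward in time) can be sketched with a standard Crank-Nicolson scheme; this is in Python rather than the poster's C, for a free wave packet, and all grid parameters are made up:

```python
import numpy as np

# 1D time-dependent Schroedinger equation, hbar = m = 1, zero potential.
N, box = 200, 40.0
x = np.linspace(-box / 2, box / 2, N)
dx, dt = x[1] - x[0], 0.01

# Gaussian wave packet centered at 0 with momentum k0 = 2.
psi = np.exp(-x ** 2 + 2j * x)
psi /= np.sqrt(np.sum(np.abs(psi) ** 2) * dx)

# Hamiltonian: -(1/2) d^2/dx^2 via central differences (Dirichlet ends).
H = (np.diag(np.full(N, 1.0 / dx ** 2))
     + np.diag(np.full(N - 1, -0.5 / dx ** 2), 1)
     + np.diag(np.full(N - 1, -0.5 / dx ** 2), -1))

# Crank-Nicolson propagator (I + iH dt/2)^-1 (I - iH dt/2):
# a Cayley transform of the Hermitian H, so exactly unitary.
I = np.eye(N)
step = np.linalg.solve(I + 0.5j * dt * H, I - 0.5j * dt * H)

for _ in range(100):  # evolve to t = 1
    psi = step @ psi

norm = np.sum(np.abs(psi) ** 2) * dx           # conserved by unitarity
x_mean = np.sum(x * np.abs(psi) ** 2) * dx     # drifts at group velocity ~2
```

The point of Crank-Nicolson over a naive explicit Euler step is exactly the unitarity: the norm stays 1 to machine precision however long you run it.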
Quick, without looking it up- what percentage of physicists subscribe to MWI?
I don't know, but apparently not enough. I may be misremembering, but I seem to recall having heard a figure of more than half.
What are two alternative interpretations of QM besides Copenhagen and MWI?
Transactional Wave and Bohmian Mechanics, or however they are spelt. The former is something tricky to do with time-reversed wave packets; the latter postulates point-shaped particles in addition to the wave packets and was disproven early on.
This is entirely the wrong attitude to have.
Yes it is, I am sorry, it was a rhetorical slip-up.
If you could look at the wavefunction and count the worlds by inspection, then these claims would have something to them. But you can’t. By inspection you can see, e.g., that a particular wavefunction contains two wavepackets, one of which is N times as high as the other. How do you go from that, to one outcome being N^2 times as frequent as the other?
You generally don’t. If I may in retrospect reword my argument, I will say that given the Schroedinger Equation there is nothing stopping decoherence from getting macroscopic. Why the Born Rule works, I have no idea, but I am pretty damn certain it has a non-mysterious explanation.
It seems I was confused about what terms were synonymous and what weren’t.
But the Copenhagen Interpretation is still stupid.
Wow, this comment was a fuckup. I meant to say strong things about Macroscopic Decoherence and accidentally came off as if I actually had an explanation of The Born Rule… Stupid illusion of transparency and fight-arguments.
Run a two-particle, single-spatial-dimension, time-dependent sim of the Schroedinger equation, starting with a high level of quantum independence, and you will see decoherence as plain as day.
Please feel free to post a link to such a sim. I’m almost willing to bet real money against it. That you would even propose that decoherence can be observed without including the environment in the simulation tells me how much of QM you really understand.
The Collapse postulate breaks CPT symmetry, violates conservation of the quantum Hamiltonian, violates Liouville's theorem, violates relativistic locality, and is non-linear, non-unitary, non-differentiable, inherently stochastic, poorly defined, anthropocentric, and formulated in deep confusion.
You mean, the straw collapse EY constructed and happily demolished. The windmill you are fighting has nothing to do with the orthodox formulation of QM, which is perfectly compatible with decoherence.
The orthodox formulation of QM (given in Griffiths and most other modern QM texts) is the following:
Time evolution of an isolated system is governed by the time-dependent Schroedinger equation (this includes einselection when the system is no longer isolated).
The Born rule: after a measurement is performed on a system in a given state, the probability of observing a given eigenstate is given by the square modulus of the system’s state’s projection onto the eigenstate.
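As a toy numerical illustration of that statement (a made-up two-state example, not anything drawn from Griffiths):

```python
import numpy as np

# Born rule: P(eigenstate) = |<eigenstate | state>|^2.
state = np.array([1.0, 1.0j]) / np.sqrt(2)       # example qubit state
basis = [np.array([1.0, 0.0]),                   # measurement eigenstates
         np.array([0.0, 1.0])]

# np.vdot conjugates its first argument, giving the inner product.
probs = [abs(np.vdot(e, state)) ** 2 for e in basis]
# For an orthonormal basis the probabilities sum to 1.
```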
Note that there is no mention of collapse. The Born rule is the miracle step in the orthodox approach (and is an open problem in physics), just like it is in any other approach, including the MWI (EY admitted as much, if you want an argument from authority).
The interpretational confusion starts once you try to invent the reasons behind the Born rule. A proper scientific way to address the issue would be to construct a model which explains the Born rule AND makes other testable predictions separate from the orthodox QM. This conjunction is essential. There is no way to simply “dissolve” the question.
The orthodox formulation of QM (given in Griffiths and most other modern QM texts)...
Here’s Griffiths on what he calls the “orthodox position” on quantum indeterminacy. This is from pages 3-5 of his text:
It was the act of measurement that forced the particle to “take a stand”… Jordan said it most starkly: “Observations not only disturb what is to be measured, they produce it… We compel [the particle] to assume a definite position.”… Among physicists it has always been the most widely accepted position. Note, however, that if it is correct there is something very peculiar about the act of measurement...
We say that the wave function collapses upon measurement, to a spike… There are, then, two entirely distinct kinds of physical processes: “ordinary” ones, in which the wave function evolves in a leisurely fashion under the Schrodinger equation, and “measurements”, in which [it] suddenly and discontinuously collapses.
Going by Griffiths’ own account (and I picked Griffiths because he’s the authority you cited), what Eliezer says about the orthodox interpretation is not a strawman. In fact, Griffiths explicitly discusses the view you call the “orthodox formulation”, except he doesn’t call it “orthodox”. He describes it as an alternative to the orthodox position, and labels it the “agnostic position”.
I think your flavor of instrumentalism is a respectable position in the foundational debate, but to describe it as the standard position is incorrect. I think there was a time when physicists in general had a more operationalist bent, but things have changed.
Hmm, I suppose my personal classification is slightly different. Thanks for pointing that out.
The agnostic position is “shut up and calculate”, which is basically resigning to one’s inability to model the Born rule with anything better.
The instrumentalist position is to admit that doing research related to the Born rule origins is essential for progress in understanding the fundamentals of QM, but to also acknowledge that interpretations are not interesting physical models and at best have only an inspirational value.
The realist position (hidden variables as fundamental, collapse as fundamental, MWI as fundamental, or Bohmian mechanics as fundamental) is the one that is easiest to falsify, as soon as it sticks its neck out with testable predictions (Bohm and collapse do not play well with relativity, local hidden variables run afoul of Bell's theorem, MWI makes no testable predictions whatsoever).
I suppose the confusion is that last paragraph: “There are, then, two entirely distinct kinds of physical processes: “ordinary” ones, in which the wave function evolves in a leisurely fashion under the Schrodinger equation, and “measurements”, in which [it] suddenly and discontinuously collapses.” This is a realist position, so I don’t favor it, because it does not make any testable predictions.
I think your flavor of instrumentalism is a respectable position in the foundational debate, but to describe it as the standard position is incorrect.
OK, I will stop calling it standard, just instrumental.
I think there was a time when physicists in general had a more operationalist bent, but things have changed.
Okay, the Orthodox QM is an informal specification of anticipated experimental results, and acknowledges decoherence as a thing. That is good to know.
My base claim is that decoherence can and will become macroscopic given time. Some physicists seem to disagree. Why? To the best of my understanding, it is obviously implied by the mathematics behind it.
I am well aware the Born Rule is a mystery. Where the Born probabilities come from, idk. Mangled Worlds seems like it might have the structure of a good explanation; it smells right, even if it isn't.
(EY admitted as much, if you want an argument from authority)
Okay, the Orthodox QM is an informal specification of anticipated experimental results, and acknowledges decoherence as a thing. That is good to know.
OK, as pragmatist pointed out, calling it orthodox is misleading. Sorry. From now on I’ll be calling it instrumentalist. As for “informal”, it’s as formal as it gets, pure math.
My base claim is that decoherence can and will become macroscopic given time.
That’s an experimental fact, you don’t need to claim anything.
Some physicists seem to disagree.
Really? Who?
Why? To the best of my understanding, it is obviously implied by the mathematics behind it.
Feel free to outline the math. The best sort-of-derivation so far, as far as I know, is given by Zurek and is known as einselection.
In addition to these formal axioms one needs a rudimentary interpretation relating the formal part to experiments. The following minimal interpretation seems to be universally accepted.

MI. Upon measuring at times t_l (l = 1, ..., n) a vector X of observables with commuting components, for a large collection of independent identical (particular) systems closed for times t < t_l, all in the same state

rho_0 = lim_{t to t_l from below} rho(t)

(one calls such systems identically prepared), the measurement results are statistically consistent with independent realizations of a random vector X with measure as defined in axiom A5.

Note that MI is no longer a formal statement since it neither defines what 'measuring' is, nor what 'measurement results' are and what 'statistically consistent' or 'independent identical system' means. Thus MI has no mathematical meaning; it is not an axiom, but already part of the interpretation of formal quantum mechanics.

[...]

The lack of precision in statement MI is on purpose, since it allows the statement to be agreeable to everyone in its vagueness; different philosophical schools can easily fill it with their own understanding of the terms in a way consistent with the remainder.

[...]

MI is what every interpretation I know of assumes (and has to assume) at least implicitly in order to make contact with experiments. Indeed, all interpretations I know of assume much more, but they differ a lot in what they assume beyond MI.

Everything beyond MI seems to be controversial. In particular, already what constitutes a measurement of X is controversial. (E.g., reading a pointer, different readers may get marginally different results. What is the true pointer reading?)
So what is the orthodox formulation of QM, which is perfectly compatible with decoherence and doesn't resemble the straw man? I'm sorry if you've posted this elsewhere, but I'd really like to know what you think.
I have three pretty significant questions: Are you a strong rationalist (good with the formalisms of Occams Razor)? Are you at all familiar with String Theory (in the sense of Doing the basic equations)? If yes to both, what is your bayes goggles view on String Theory?
What on earth is the String Theory controversy about, and is it resolvable at a glance like QM’s MWI?
There isn’t a unified “string theory controversy”.
The battle-tested part of fundamental physics consists of one big intricate quantum field theory (the standard model, with all the quarks, leptons etc) and one non-quantum theory of gravity (general relativity). To go deeper, one wishes to explain the properties of the standard model (why those particles and those forces, why various “accidental symmetries” etc), and also to find a quantum theory of gravity. String theory is supposed to do both of these, but it also gets attacked on both fronts.
Rather than producing a unique prediction for the geometry of the extra dimensions, leading to unique and thus sharply falsifiable predictions for the particles and forces, present-day string theory can be defined on an enormous, possibly infinite number of backgrounds. And even with this enormous range of vacua to choose from, it’s still considered an achievement just to find something with a qualitative resemblance to the standard model. Computing e.g. the exact mass of the “electron” in one of these stringy standard models is still out of reach.
Here is a random example of a relatively recent work of string phenomenology, to give you an idea of what is considered progress. The abstract starts by saying that certain vacua are known which give rise to “the exact MSSM spectrum”. The MSSM is the standard model plus minimal supersymmetry. Then they point out that these vacua will also have to have an extra electromagnetism-like force (“gauged U(1)_B-L”). We don’t see such a force, so therefore the “B-L” photons must be heavy, and the gist of the paper is to point out that this can be achieved if one of the neutrino superpartners acts like a Higgs field (by “acquiring a vacuum expectation value”). In fact this paper doesn’t contain string calculations per se; it’s an argument at the level of quantum field theory, that the field-theory limit of these string models is potentially consistent with experiment.
That might not sound exciting, but in fact it’s characteristic, not just of string phenomenology, but of theoretical particle physics in general. Progress is incremental. Grand unified theories don’t explain the masses of the particles, but they can explain the charges. String theory hasn’t yet explained the masses, but it has the potential to do so, in that they will be set by the stabilized size and shape of the extra dimensions. The topology of the extra dimensions is (currently) a model-building choice, but once that choice is made, the masses should follow, they’re not free parameters as in field theory.
As for what might determine the topology of the extra dimensions, anthropic selection is a popular answer these days—and that has become another source of dissatisfaction for string theory’s critics, because it looks like another step back from predictivity. Except in very special cases like the cosmological constant, where a large value makes any kind of physical structure impossible, there’s enormous scope for handwaving explanations here… Actually, there are arguments that the different vacua of the “landscape” should be connected by quantum tunneling, so the vacuum we are in may be a long-lived metastable vacuum arrived at after many transitions in the primordial universe. But even if that’s true, it doesn’t tell you whether the number of metastable minima in the landscape is one or a googol. This is an aspect of string theory which is even harder than calculating the particle masses in a particular vacuum, judging by the amount of attention it gets. The empirical side of string theory is still dominated by incrementally refining the level of qualitative approximation to the standard model (including the standard cosmological model, “lambda CDM”) that is possible.
As for quantum gravity, the situation is somewhat different. String theory offers a particular solution to the problems of quantum gravity, like accounting for black hole entropy, preserving unitarity during Hawking evaporation, and making graviton behavior calculable. I’d say it is technically far ahead of any rival quantum gravity theory, but none of that stuff is observable. So approaches to quantum gravity which are much less impressive, but also much simpler, continue to have supporters.
Great reply, thank you for clearing up my confusion.
I don’t do formal Bayes or Kolmogorov on a daily basis; in particle physics Bayes usually appears in deriving confidence limits. Still, I’m reasonably familiar with the formalism. As for string theory, my jest in the OP is quite accurate: I dunno nuffin’. I do have some friends who do string-theoretical calculations, but I’ve never been able to shake out an answer to the question of what, exactly, they’re calculating. My basic view of string theory has remained unchanged for several years: Come back when you have experimental predictions in an energy or luminosity range we’ll actually reach in the next decade or two. Kthxbye.
The controversy is, I suppose, that there’s a bunch of very excited theorists who have found all these problems they can sic their grad students on, problems which are hard enough to be interesting but still solvable in a few years of work; but they haven’t found any way of making, y’know, actual predictions of what will happen in current or planned experiments if their theory is correct. So the question is, is this a waste of perfectly good brains that ought to be doing something useful? The answer seems to me to be a value judgement, so I don’t think you can resolve it at a glance.
This is roughly what I can discern from outside academia in general (I’m 19 years old and at time of posting about to graduate the local equivalent of high-school).
I wonder how you resolve the MWI “at a glance”. There are strong opinions on both sides, and no convincing (to the other side) argument to resolve the disagreement. (This statement is an indisputable experimental fact.) If you mean that you are convinced by the arguments from your own camp, then I doubt that it counts as a resolution.
Also, the Occam’s razor is nearly always used by physicists informally, not calculationally (partly because Kolmogorov complexity is not computable).
As for the string theory, I don’t know how to use Bayes to evaluate it. On one hand, this model gives some hope of eventually finding something workable, since it provided a number of tantalizing hints, such as the holographic principle and various dualities. On the other hand, every testable prediction it has ever made has been successfully falsified. Unfortunately, there are few other competing theories. My guess is that if something better comes along, it will yield the string theory in some approximation.
MagnetoHydroDynamics may find this most useful as an answer to his first question rather than to his question about string theory. It gives him significant information about your rationalist strengths and ability to apply Occams Razor usefully. To use the language above we could describe this in terms of ‘camps’. Magneto can identify you as not part of his desired camp and correctly use that to determine how much weight to place on your testimony in other areas. (Not belonging to his ‘camp’ you would naturally either disagree or take offence at his disrespect).
Evaluating ‘rationalist strengths’ via answers to questions about physics you don’t actually know well enough to evaluate anything, is also a very effective way to be stupid and reveal your own ignorance of QM.
Very astute observation.
As wedrifid says, this comment tells me that you are, regrettably, not as strong a Bayesian as I would wish many physicists were.
Resolving MWI at a glance involves looking at the Schroedinger equation and conclude that it gives rise to decoherence and that when decoherence gets large enough, the Schroedinger equation says nothing anormalous happens, it just keeps on being decoherent.
That is literally the whole of the argument, and to me saying that something extra and mysterious happens is stupid in an absolute sense. Run a two-particle, single-spatial dimension, time dependent sim of the Schroedinger equation, starting with a high level of quantum independence, and you will see decoherence as plain as day.
Decoherence is simple, falsifiable, and explains all hitherto observed data.
The Collapse postulate breaks CPT symmetry, violates conservation the quantum hamiltonian, violates Liouvilles theorem, violates relativistic locality, is non-linear, is non-unitary, is non-differentiable, inherently stochastic, poorly defined, anthropocentric and formulated in deep confusion.
Pick your side.
And your comment makes obvious that you are not a physicist, and have learned QM from someone who is not a physicist. Quick, without looking it up- what percentage of physicists subscribe to MWI? What are two alternative interpretations of QM besides Copenhagen and MWI?
This is entirely the wrong attitude to have.
I’m a physicist and I wouldn’t know that myself. Especially because I seem to recall different surveys giving vastly different results.
Sure! The approach is the informative part, and I should have worded my post better to make that clearer. Something along the lines of “why do you believe that many physicists reject MWI for those reasons?” would have been less confrontational and probably more communicative.
Yes I am not a physicist, I will at best be a first year CS bachelor student in a little over a year from time of posting. I am, however, really good at mathematics. Good enough in fact to be able to solve partial differential equations in complex scalar fields, and simulate them accordingly with custom written C programs as a hobby.
I might not know the first equation of QTF, but I can write a Schroedinger-wave-packet equation, derive the time dependent differentialtion, discretize it and simulate it in a two dimensional discrete hilbert space with dependent potential wells and set the initial state to high independence.
I don’t know, but apparently not enough. I seem to misremember having heard a figure of more than half.
Transitional Wave and Bhomian Mechanics, or however they are spelt. The former is something tricky to do with time-reversed wave-packets, the latter postulates point-shaped particles in addition to the wave packets and was disproven early on.
Yes it is, I am sorry, it was a rethorical slip up.
If you could look at the wavefunction and count the worlds by inspection, then these claims would have something to them. But you can’t. By inspection you can see, e.g., that a particular wavefunction contains two wavepackets, one of which is N times as high as the other. How do you go from that, to one outcome being N^2 times as frequent as the other?
You generally don’t. If I may in retrospect reword my argument, I will say that given the Schroedinger Equation there is nothing stopping decoherence from getting macroscopic. Why the Born Rule works, I have no idea, but I am pretty damn certain it has a non-mysterious explanation.
It seems I was confused about what terms were synonymous and what weren’t.
But the Copenhagen Interpretation is still stupid.
Wow, this comment was a fuckup. I meant to say strong things about Macroscopic Decoherence and accidentally came off as if I actually had an explanation of The Born Rule… Stupid illusion of transparency and fight-arguments.
Please feel free to post a link to such a sim. I’m almost willing to bet real money against it. That you would even propose that decoherence can be observed without including the environment in the simulation tells me how much of QM you really understand.
You mean, the straw collapse EY constructed and happily demolished. The windmill you are fighting has nothing to do with the orthodox formulation of QM, which is perfectly compatible with decoherence.
What is the orthodox formulation of QM? Link?
The orthodox formulation of QM (given in Griffiths and most other modern QM texts) is the following:
Time evolution of an isolated system is governed by the time-dependent Schroedinger equation (this yields einselection once the environment is included as part of the system).
The Born rule: after a measurement is performed on a system in a given state, the probability of observing a given eigenstate is given by the square modulus of the system’s state’s projection onto the eigenstate.
Note that there is no mention of collapse. The Born rule is the miracle step in the orthodox approach (and is an open problem in physics), just like it is in any other approach, including the MWI (EY admitted as much, if you want an argument from authority).
The interpretational confusion starts once you try to invent the reasons behind the Born rule. A proper scientific way to address the issue would be to construct a model which explains the Born rule AND makes other testable predictions separate from the orthodox QM. This conjunction is essential. There is no way to simply “dissolve” the question.
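The Born rule as stated above can be illustrated numerically, which also makes concrete the earlier "N times as high versus N² times as frequent" point. This is a toy two-dimensional example of my own construction, not anything from the thread: the functions `inner` and `born_probability` and the amplitude ratio N = 3 are illustrative assumptions.

```python
import math

def inner(u, v):
    """Complex inner product <u|v>."""
    return sum(a.conjugate() * b for a, b in zip(u, v))

def born_probability(state, eigenstate):
    """Born rule: P(eigenstate | state) = |<eigenstate|state>|^2."""
    return abs(inner(eigenstate, state)) ** 2

# A state whose two amplitudes are in ratio 1 : N
# (cf. the wavepacket that is "N times as high" as the other).
N = 3.0
norm = math.sqrt(1 + N ** 2)
state = [1 / norm, N / norm]

up, down = [1.0, 0.0], [0.0, 1.0]      # measurement eigenbasis
p_up = born_probability(state, up)
p_down = born_probability(state, down)

print(p_down / p_up)   # outcome ratio is N^2 = 9.0, not N
print(p_up + p_down)   # probabilities sum to 1.0
```

Nothing in this snippet explains *why* the squared modulus gives the observed frequencies; it only states the rule, which is exactly the open problem the thread is circling.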
Here’s Griffiths on what he calls the “orthodox position” on quantum indeterminacy. This is from pages 3-5 of his text:
Going by Griffiths’ own account (and I picked Griffiths because he’s the authority you cited), what Eliezer says about the orthodox interpretation is not a strawman. In fact, Griffiths explicitly discusses the view you call the “orthodox formulation”, except he doesn’t call it “orthodox”. He describes it as an alternative to the orthodox position, and labels it the “agnostic position”.
I think your flavor of instrumentalism is a respectable position in the foundational debate, but to describe it as the standard position is incorrect. I think there was a time when physicists in general had a more operationalist bent, but things have changed.
Hmm, I suppose my personal classification is slightly different. Thanks for pointing that out.
The agnostic position is “shut up and calculate”, which is basically resigning to one’s inability to model the Born rule with anything better.
The instrumentalist position is to admit that doing research related to the Born rule origins is essential for progress in understanding the fundamentals of QM, but to also acknowledge that interpretations are not interesting physical models and at best have only an inspirational value.
The realist position (hidden variables are fundamental, collapse is fundamental, or MWI is fundamental, or Bohmian mechanics is fundamental) is the one that is easiest to falsify, as soon as it sticks its neck out with testable predictions (Bohm and collapse do not play well with relativity, local hidden variables run afoul of the Bell theorem, MWI makes no testable predictions whatsoever).
I suppose the confusion is that last paragraph: “There are, then, two entirely distinct kinds of physical processes: “ordinary” ones, in which the wave function evolves in a leisurely fashion under the Schrodinger equation, and “measurements”, in which [it] suddenly and discontinuously collapses.” This is a realist position, so I don’t favor it, because it does not make any testable predictions.
OK, I will stop calling it standard, just instrumental.
How?
Okay, the Orthodox QM is an informal specification of anticipated experimental results, and acknowledges decoherence as a thing. That is good to know.
My base claim is that decoherence can and will become macroscopic given time. Some physicists seem to disagree. Why? To the best of my knowledge it is obviously implied by the mathematics behind it.
I am well aware the Born Rule is a mystery. Where the Born Probabilities come from, idk. Mangled Worlds seems like it might have the structure of a good explanation, it smells right, even if it isn’t.
Now that was uncalled for.
OK, as pragmatist pointed out, calling it orthodox is misleading. Sorry. From now on I’ll be calling it instrumentalist. As for “informal”, it’s as formal as it gets, pure math.
That’s an experimental fact, you don’t need to claim anything.
Really? Who?
Feel free to outline the math. The best sort-of-derivation so far, as far as I know, is given by Zurek and is known as einselection.
Perception of groups are often skewed. Mine was.
That update out of the way, why are we arguing? We do not disagree.
Aumann ftw!
I wouldn’t call it “orthodox”, but see this:
[...]
[...]
So what is the orthodox formulation of QM, which is perfectly compatible with decoherence and doesn’t resemble the straw man? I’m sorry if you’ve posted this elsewhere, but I’d really like to know what you think.
See my reply to MagnetoHydroDynamics.