It would seem to suggest that if I want to be rich I should buy a bunch of lottery tickets and then kill myself when I don’t win.
I have not seen the local discussion of MWI and Everett branches, but my “conclusion” in the past has been that MWI is a defect of the map maker and not a feature of the territory. I’d be happy to be pointed to something that would change my mind or at least rock it a bit, but for now it looks like angels dancing on the heads of pins. Has somebody provided an experiment that would rule MWI in or out? If so, what was the result? If not, then how is a consideration of MWI anything other than confusing the map with the territory?
If I have fallen into Appeal to Consequences with my original post, then my bad.
It would seem to suggest that if I want to be rich I should buy a bunch of lottery tickets and then kill myself when I don’t win.
I don’t think that’s the case, but even if it were, using that to argue against the likelihood of MWI would be Appeal to Consequences.
I have not seen the local discussion of MWI and Everett branches, but my “conclusion” in the past has been that MWI is a defect of the map maker and not a feature of the territory.
That’s what I used to think :)
I’d be happy to be pointed to something that would change my mind or at least rock it a bit
If you’re prepared for a long but rewarding read, Eliezer’s Quantum Physics Sequence is a non-mysterious introduction to quantum mechanics, intended to be accessible to anyone who can grok algebra and complex numbers. Cleaning up the old confusion about QM is used to introduce basic issues in rationality (such as the technical version of Occam’s Razor), epistemology, reductionism, naturalism, and philosophy of science.
Has somebody provided an experiment that would rule MWI in or out? If so, what was the result? If not, then how is a consideration of MWI anything other than confusing the map with the territory?
The idea is that MWI is the simplest explanation that fits the data, by the definition of “simplest” that has proven most useful for predicting which of several theories matching the same data is actually correct.
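To make the flavor of that concrete, here is a toy sketch (the “theories” and their wordings below are invented stand-ins, not real formalizations of QM): the technical Occam’s Razor scores a theory by the length of its description, not by how much stuff the theory says exists.

```python
# Toy illustration of description-length simplicity (minimum description
# length flavor). The theory strings below are made-up placeholders.

def description_length(theory_source: str) -> int:
    """Complexity proxy: bytes needed to state the theory's rules."""
    return len(theory_source.encode("utf-8"))

# One stand-in theory is just the dynamical rules; the other is the same
# rules plus an extra postulate bolted on.
bare_rules = "unitary evolution of the wavefunction"
rules_plus_extra = "unitary evolution of the wavefunction plus a collapse postulate"

# The shorter *description* is preferred, even if what it describes
# (e.g. many branches) is vastly larger than what the longer one describes.
assert description_length(bare_rules) < description_length(rules_plus_extra)
```

The point of the toy is only the direction of the comparison: simplicity is measured on the rules, so positing many worlds costs nothing extra if the rules themselves got shorter.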
For a shorter sequence that concentrates on why MWI wins, see And the Winner is… Many-Worlds!