This is an example of Eliezer’s extreme overconfidence. As he rightly points out, we cannot in fact construct a quantum mechanical model of a 747. Yet he asserts as absolute fact that such a model would be more accurate than our usual models.
This is the point made in “A Different Universe” by Robert B. Laughlin, a Nobel Prize-winning physicist. He is a solid-state physicist and argues that:
Going from a more “fundamental” to a “higher” level requires computations that are in principle intractable. You cannot possibly avoid the use of levels of analysis. It is not just a matter of computational convenience. [I admit that the universe does the calculation, but we have no idea how.]
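A rough back-of-the-envelope sketch of the scale involved (my illustration, not Laughlin's): even if each particle is treated as nothing more than a two-level quantum system, the dimension of the joint state space grows as 2^N, which is already hopeless for a few hundred atoms, let alone a mole of gas.

```python
# Sketch (an editorial illustration, not from the book): why "computing the
# higher level from the lower level" blows up. Each particle is modelled as a
# single two-level quantum system, a deliberate understatement; the joint
# state space then has dimension 2**N.
import math

def log10_state_space(n_particles: float) -> float:
    """Log base 10 of the Hilbert-space dimension for n two-level systems."""
    return n_particles * math.log10(2)

# A toy system, a few hundred atoms, and roughly one mole of gas.
for n in (10, 300, 6.022e23):
    print(f"N = {n:.3g}: state-space dimension ~ 10^{log10_state_space(n):.4g}")

# N = 300 already gives ~10^90 dimensions, more than the ~10^80 atoms in the
# observable universe; a mole gives ~10^(1.8e23). Tracking such a state
# exactly is not merely inconvenient, it is out of reach in principle.
```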
Laughlin won his Nobel for “explaining” the fractional quantum Hall effect before anyone else did. But he casts scorn on such explanations, pointing out that of the 27 solid phases of water, not one was predicted, yet all have been “explained” after the fact.
Phenomena at higher levels are often, even usually, insensitive to the nature of the levels below. A good example is the statistical mechanics of gases, which hardly changed when our view of the atoms that make up gases changed from hard Newtonian balls to fuzzy quantum blobs.
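A standard textbook way to see this insensitivity (my addition; the comment only states the point): the ideal-gas equation of state comes out the same whether the atoms are classical point particles or quantum particles, so long as the gas is dilute.

```latex
% Equation of state of an ideal quantum gas (upper sign: bosons, lower: fermions),
% with thermal de Broglie wavelength \lambda and number density n = N/V:
\frac{PV}{N k_B T} \;=\; 1 \mp \frac{n \lambda^3}{2^{5/2}} + \cdots,
\qquad
\lambda = \frac{h}{\sqrt{2\pi m k_B T}} .
% For air at room temperature n\lambda^3 \sim 10^{-7}, so the quantum correction
% is negligible and we recover the classical result PV = N k_B T.
```

The microscopic picture (hard Newtonian balls vs. fuzzy quantum blobs) only shows up in a correction term that is vanishingly small under ordinary conditions.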
There is plenty of evidence that “fundamental” physics is just the statistical mechanics of a lower layer; e.g., all those “virtual particles”: what are they all about? “Empty” space seems to be about as empty as the Super Bowl on the day of the big game. In fact, there is no evidence that “fundamental” physics is fundamental at all. We don’t even have any indication of how many layers there are before we get to the turtle at the bottom, if there is one.
Doesn’t the fact that the universe is carrying out these computations mean that they are feasible in principle? Our current ignorance of how this is done is irrelevant. Am I missing something?
statistical mechanics of gases, which hardly changed when our view of the atoms that make up gases changed from hard Newtonian balls to fuzzy quantum blobs
This seems to be a map/territory confusion. A change in our model shouldn’t change what we observe. If our high-level theories changed dramatically, that would be a bad sign.
If the computations are in principle intractable, then the fact that the universe carries them out doesn’t make them feasible; it makes the universe Not a Computer in principle.