The Einstein equation was singled out in the Quanta magazine article. I respect the author; she has written a lot of good articles for Quanta, but this one was quite misleading.
I don’t understand your second-to-last point. Are you talking about a mathematical algorithm or about a physical measurement? You write, “no matter how many digits we observe following the law-like pattern, the future digits may still deviate from that pattern.” What pattern?
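(If the worry is just that finitely many confirming instances never force a pattern to continue, a standard arithmetic illustration, my example rather than anything from the article, is Euler’s polynomial:

$$ n^2 + n + 41 $$

is prime for every n = 0, 1, ..., 39, yet fails at n = 40, since 40^2 + 40 + 41 = 1681 = 41^2. But I am not sure that is the kind of pattern you mean.)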
The biggest issue with it seems to be that in order to evaluate the evidence provided by empirical observations we must have a rational framework that includes logic and math. If logic and math themselves were simply observational, then we would have no framework for evaluating the evidence provided by those observations.
No, we don’t. And yes, they are. We start with some innate abilities of the brain, add the culture we are brought up in, then develop models of empirical observations, whatever they are. 1+1=2 is an abstraction of various empirical observations, be it counting sheep or working through mathematical proofs. Logic and math co-develop with increasingly complex models and increasingly non-trivial observations; there is no prior “we need logic and math to evaluate evidence” step. If you look through the history of science, math was being developed alongside physics, as one of the tools. In that sense the Noether theorem, for example, is akin to, say, a new kind of telescope.
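(To make the “theorem as tool” analogy concrete, here is the simplest textbook instance of Noether’s theorem, from classical mechanics; nothing here is specific to the article. If the Lagrangian L(q, q̇) has no explicit time dependence, then

$$ E = \dot q\,\frac{\partial L}{\partial \dot q} - L \quad\text{satisfies}\quad \frac{dE}{dt} = -\frac{\partial L}{\partial t} = 0, $$

i.e. time-translation symmetry hands you energy conservation, the way a telescope hands you new data.)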
What is your epistemic justification for asserting such a guarantee of failure?
Because they are of the type that is “not even wrong”. The standard math works just fine for both GR and QM; the two main issues are conceptual, not mathematical: how does the (nonlinear) projection postulate emerge from the linear evolution (and no, MWI is not a useful “answer”, it has zero predictive power), and how do QM and GR mesh at the mesoscopic scale (i.e. what are the gravitational effects of a spatially separated entangled state?).
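(To spell out the first tension in its standard textbook form, with a schematic measurement interaction U; the notation is mine, not anything from the article. Unitary evolution gives

$$ U\Big[\big(\alpha|0\rangle + \beta|1\rangle\big)\otimes|\text{ready}\rangle\Big] = \alpha\,|0\rangle|\text{saw }0\rangle + \beta\,|1\rangle|\text{saw }1\rangle, $$

while the projection postulate instead asserts a single outcome, e.g. |0⟩|saw 0⟩ with probability |α|², and no linear U can produce that nonlinear, state-dependent jump. For the second question, the naive semiclassical coupling

$$ G_{\mu\nu} = 8\pi G\,\langle\psi|\hat T_{\mu\nu}|\psi\rangle $$

would have a mass in a spatially separated superposition gravitate from the “average” position, which is exactly the mesoscopic regime where we have no tested theory.)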