“so Gisin’s musings… are guaranteed to be not a step in any progress of the understanding of physics.”
What is your epistemic justification for asserting such a guarantee of failure? Of course, any new speculative idea in theoretical physics is far from likely to be adopted as part of the core theory, but you are making a much stronger claim by saying that it will not even be “a step in any progress of the understanding of physics”. Even ideas that are eventually rejected as false are often useful for developing understanding. Gisin’s papers ask physicists to consider their unexamined assumptions about the nature of math itself, which seems at least like a fruitful path of inquiry, even if it won’t necessarily lead to any major breakthroughs.
“mathematical proofs are as much observations as anything else. Just because they happen in one’s head or with a pencil on paper, they are still observations.”
This reminds me of John Locke’s view that mathematical truths come from observation of internal states. That is an interesting perspective, but I’m not sure it can hold up to scrutiny. The biggest issue with it seems to be that in order to evaluate the evidence provided by empirical observations we must have a rational framework which includes logic and math. If logic and math themselves were simply observational, then we would have no framework for evaluating the evidence provided by those observations. Perhaps you can give an alternative account of how we evaluate evidence without presupposing a rational framework.
“The difficulty of calculating a far-away digit in the decimal expansion of pi has nothing to do with pi itself: you can perfectly well define it as the ratio of circumference to diameter, or as a limit of some series”
I agree with this statement. I think, though, it misses the point I was elaborating about Brouwer’s concept of choice sequences. The issue isn’t that we can’t define a sequence that is equivalent to the infinite expansion of pi; it is rather that for any real quantity we can never be certain that it will continue to obey the lawlike expansion into the future. So the issue isn’t the “difficulty of calculating a far-away digit”; the issue is that no matter how many digits we observe following the lawlike pattern, the future digits may still deviate from that pattern. No matter how many digits of pi a real number contains, the next digit might suddenly differ from the corresponding digit of pi (in which case we would say retrospectively that the real number was never equal to pi in the first place). This is actually what we observe if we, say, measure the ratio of a jar lid’s circumference to its diameter: the first few digits will match pi, but as we go to smaller scales it will deviate.
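To make the jar-lid point concrete, here is a minimal Python sketch (my own illustration, not part of the original exchange; the 1-part-in-10^4 measurement error is an arbitrary assumption). It simulates a measured circumference-to-diameter ratio and counts how many leading digits agree with pi:

```python
import math

# Hypothetical jar-lid measurement (illustrative numbers only):
# the circumference reading carries a relative error of ~1e-4.
diameter = 0.1000                                # metres, as measured
circumference = math.pi * diameter * (1 + 1e-4)  # simulated reading

measured_ratio = circumference / diameter

s_pi = f"{math.pi:.15f}"
s_meas = f"{measured_ratio:.15f}"

# Count the leading characters on which the two digit strings agree.
agree = 0
for a, b in zip(s_pi, s_meas):
    if a != b:
        break
    agree += 1

print("pi       :", s_pi)    # 3.141592653589793
print("measured :", s_meas)  # 3.141906812855152
print("leading characters in agreement:", agree)  # "3.141" -> 5
```

However many leading digits happen to agree, nothing about that finite record fixes the next digit, which is the choice-sequence point.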
“…the idea that Einstein’s equations are somehow unique in terms of being timeless is utterly false”
I made no claim that they are unique in this regard.
“mathematical proofs are as much observations as anything else. Just because they happen in one’s head or with a pencil on paper, they are still observations.”
I think this is better explained as:
We try to do math, but we can make mistakes.*
If two people evaluate an arithmetic expression the same way, but one makes a mistake, then they might get different answers (see the short sketch after this list).
*Other examples:
1. You can try to create a mathematical proof. But if you make a mistake, it might be wrong (even if the premises are right).
2. Is it an incorrect proof, a typo, or just something on your computer screen?
A proof might have a mistake in it and thus “be invalid”. But it could also have a typo, which if corrected yields a “valid proof”.
Or, the proof might not have a mistake in it—you could have misread it, and what it says is different from what you saw. (Someone can also summarize a proof badly.)
If the copy of the proof you have is different from the original, errors (or changes) could have been introduced along the way.
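A toy sketch of the arithmetic example above (my own, with hypothetical numbers):

```python
# The same expression, evaluated once correctly and once with a
# common precedence slip: doing the addition before the multiplication.
correct = 2 + 3 * 4     # multiplication binds tighter: 2 + 12 = 14
mistaken = (2 + 3) * 4  # the slip, applied left to right: 5 * 4 = 20

print("2 + 3 * 4 done correctly:", correct)   # 14
print("2 + 3 * 4 with the slip :", mistaken)  # 20
```

Both evaluators tried to follow the same rules; the disagreement between 14 and 20 is what signals that at least one attempt contains a mistake.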
Thanks for your comment. My replies are below. Let me reply to the last one first :)

The Einstein equation was singled out in the Quanta magazine article. I respect the author, who has written a lot of good articles for Quanta, but this was quite misleading.
I don’t understand your second-to-last point. Are you talking about a mathematical algorithm or about a physical measurement? “No matter how many digits we observe following the lawlike pattern, the future digits may still deviate from that pattern”: what pattern?
“The biggest issue with it seems to be that in order to evaluate the evidence provided by empirical observations we must have a rational framework which includes logic and math. If logic and math themselves were simply observational, then we would have no framework for evaluating the evidence provided by those observations.”
No, we don’t. And yes, they are. We start with some innate abilities of the brain, add the culture we are brought up in, then develop models of empirical observations, whatever they are. 1+1=2 is an abstraction of various empirical observations, be it counting sheep or working through mathematical proofs. Logic and math co-develop with increasingly complex models and increasingly non-trivial observations; there is no “we need logic and math to evaluate evidence”. If you look through the history of science, math was developed alongside physics, as one of its tools. In that sense the Noether theorem, for example, is akin to, say, a new kind of telescope.
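One way to make the “proofs are observations” reading concrete (an illustrative formalization of mine, not something from the thread): even the most austere proof of 1+1=2 is ultimately checked by a physical process, e.g. a proof assistant running on hardware, and we observe its verdict.

```lean
-- Lean 4 sketch. Elaborating this file is itself a physical event:
-- a machine runs the kernel and we observe that it reports no error.
theorem one_plus_one_eq_two : 1 + 1 = 2 := rfl  -- holds by computation

#eval 1 + 1  -- we observe the printed output: 2
```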
“What is your epistemic justification for asserting such a guarantee of failure?”
Because they are of the type that is “not even wrong”. The standard math works just fine for both GR and QM; the two main issues are conceptual, not mathematical: how does the (nonlinear) projection postulate emerge from the linear evolution (and no, MWI is not a useful “answer”; it has zero predictive power), and how do QM and GR mesh at the mesoscopic scale (i.e., what are the gravitational effects of a spatially separated entangled state)?
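For concreteness, the contrast being pointed at, in standard textbook notation (added here as a gloss, not a quotation):

```latex
% Linear, deterministic evolution between measurements:
i\hbar \,\frac{\partial}{\partial t}\lvert\psi\rangle = \hat{H}\lvert\psi\rangle
% versus the nonlinear, stochastic update on observing outcome k
% (with projector \hat{P}_k):
\lvert\psi\rangle \;\mapsto\;
  \frac{\hat{P}_k\lvert\psi\rangle}{\lVert\hat{P}_k\lvert\psi\rangle\rVert},
\qquad \Pr(k) = \lVert\hat{P}_k\lvert\psi\rangle\rVert^{2}
```

How the second, nonlinear rule emerges from the first, linear one is a conceptual question; nothing about it obviously calls for amending the underlying mathematics of real numbers.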