The importance of Turing machines lies in this: every simple model of computation anyone can come up with is either less capable than Turing machines (e.g. finite state machines with no infinite tape) or exactly equal in capability.
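To make the model concrete, here is a minimal sketch of a single-tape Turing machine simulator. The machine, states, and transition table are illustrative inventions, not anything from the thread; the point is just that the model is a finite rule table plus an unbounded tape.

```python
# Minimal Turing machine simulator (illustrative sketch).
# The example machine below flips every bit of a binary input
# and halts when it reaches the blank past the right end.

def run_tm(transitions, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a single-tape Turing machine.

    transitions: dict mapping (state, symbol) -> (new_state, write, move)
                 where move is -1 (left) or +1 (right).
    Returns the final non-blank tape contents as a string.
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break  # halt: no applicable rule
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Bit-flipping machine: scan right, inverting 0 <-> 1, halt on blank.
FLIP = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
}

print(run_tm(FLIP, "0110"))  # -> 1001
```

A finite state machine is what you get if you delete the tape-writing and the unbounded `cells` dictionary; that restriction is exactly what makes it strictly less capable.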
I’m not sure what work “simple” is doing there. There are theoretical models of hypercomputation (models more powerful than Turing machines), but we can’t build them, so the Turing limit appears to be a physical fact, not a mathematical one.
“Simple” is indeed doing no work and should have been omitted.
The thesis that computation beyond the Turing limit is physically impossible has also been proposed. We can imagine various sorts of hypercomputation, but as far as we know, they can’t be built. This version of the general idea that Turing-uncomputable functions cannot be computed is known as the physical Church-Turing thesis. In principle, given some particular mathematical theory of physics, one might be able to prove it as a theorem.
It has been speculated that quantum physics might reopen the question by allowing some new ways of calculating things that are not possible in classical physics. I think Scott Aaronson has written on the subject, and has proposed that if a theory of physics does allow computability beyond Turing, that in itself is evidence against the theory.
I think Scott Aaronson has written on the subject, and has proposed that if a theory of physics does allow computability beyond Turing, that in itself is evidence against the theory.
I would disagree; at least, I wouldn’t count it as much evidence against the theory.
In some sense, this is me being annoyed at how often people assume the principle of normality (Egan’s law) universally applies. This is a very clear failure case of that principle: conditional on hypercomputers being buildable, things don’t add up to normality, but to extremes.
It has been speculated that quantum physics might reopen the question by allowing some new ways of calculating things that are not possible in classical physics.
No, quantum physics does not allow you to make computers beyond Turing machines.
This. There are scenarios where mathematically speaking, you can construct machines more powerful than Turing machines, but they rely on physics that as far as we know can’t be instantiated.
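One way to see why quantum mechanics doesn’t break the Turing limit: any quantum circuit can be simulated step by step on a classical computer by multiplying out its state vector (with exponential slowdown in the number of qubits), so quantum physics offers at best speedups, not computation of new functions. A one-qubit sketch (the gate and state names are standard, the code itself is illustrative):

```python
# Classically simulating a one-qubit quantum computation (sketch).
# Applying the Hadamard gate twice returns |0> to |0>, and the whole
# evolution is ordinary matrix arithmetic a Turing machine can do.

import math

def apply_gate(gate, state):
    """Multiply a 2x2 gate matrix by a 2-component complex state vector."""
    return [
        gate[0][0] * state[0] + gate[0][1] * state[1],
        gate[1][0] * state[0] + gate[1][1] * state[1],
    ]

s = 1 / math.sqrt(2)
H = [[s, s],
     [s, -s]]  # Hadamard gate

ket0 = [1 + 0j, 0 + 0j]           # the |0> state
superposed = apply_gate(H, ket0)  # equal superposition of |0> and |1>
back = apply_gate(H, superposed)  # H is its own inverse: back to |0>

probs = [abs(a) ** 2 for a in back]  # measurement probabilities
print([round(p, 6) for p in probs])
```

Nothing here is beyond a classical machine; the cost of this simulation just grows as 2^n in the number of qubits, which is where quantum *speedups* (not super-Turing power) come from.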