So: I am not too worried about the universe being uncomputable.
On the race to superintelligence, there are more pressing things to worry about than such possibilities. Those interested in winning that race should prioritise their efforts, with issues like this at the bottom of the heap; otherwise they are more likely to fail.
I don’t think that Solomonoff induction has a problem in this area, but it is a plausible explanation of what the reference to “higher-order logic” was about.