My knowledge of complexity theory is very limited, but I thought exponentiation could be done in polynomial time. Is that incorrect, or am I misunderstanding Nelson’s argument? (Please don’t tell me he also doubts complexity theory; we must be allowed some fun :P)
EDIT: Yes, I was wrong; the claim of polynomial time in the length of b may well be too strong, but it does seem to take fewer than 2^b steps. I think I can use positional notation to compute 2^b in O(b^2) steps, which seems to lead us straight from accepting multiplication to accepting exponentiation.
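Roughly what I have in mind, as a minimal sketch (the decimal digit list and the doubling loop are my own illustration, not anything taken from Nelson):

```python
def pow2_positional(b):
    """Compute 2**b in decimal positional notation by repeated doubling.

    Each doubling is one pass over the current digit list with a carry,
    i.e. O(number of digits) = O(b) elementary steps, so b doublings cost
    O(b^2) -- polynomial in b, even though the resulting *value* 2**b is
    exponentially large.
    """
    digits = [1]  # little-endian decimal digits, representing the value 1
    for _ in range(b):
        carry = 0
        for i, d in enumerate(digits):
            carry, digits[i] = divmod(2 * d + carry, 10)
        if carry:
            digits.append(carry)
    return ''.join(map(str, reversed(digits)))

print(pow2_positional(10))  # '1024'
```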
I think so, in positional notation. That’s what I was alluding to with my O(log log k) comment. But there is no polynomial-time algorithm to reduce 2^n to a tally number (obviously, since the tally representation itself consists of 2^n marks).
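To make the parenthetical concrete, here is a small check (a sketch; n = 20 is just an arbitrary example value) comparing the positional and tally lengths:

```python
n = 20
positional_len = len(bin(2**n)) - 2  # binary representation: n + 1 symbols
tally_len = 2**n                     # tally/unary representation: 2**n strokes
# Merely writing the tally output already takes exponentially many steps in n.
print(positional_len, tally_len)     # 21 vs 1048576
```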
I once read something by Nelson about the polynomial hierarchy in complexity theory, but I can’t find it now.