Yes, I considered that ambiguity, and certainly you couldn’t send him a survey. But it gives a lower bound, and Turing does talk about machines equaling or exceeding human capacities across the board.
Hm. Would it be justifiable to extrapolate Turing’s predictions? Because we know that he was off by at least a decade on just the AI; presumably any Singularity would have to be off by that much or more.