Yes, I wouldn’t have bothered if he had said something like that; the thing is, from the above, that didn’t seem to be the objection he was making. Since he now says it essentially is, I think I’ll step out of this argument. (Well, the first two sentences are easily answerable, but I’ll let someone else do that if they really want to.) Also, apparently by “intuition is required” he means “brains cannot carry out an algorithm 100% reliably, and 100% reliability is required” (or something like that). Which would, I suppose, make him the first person I’ve heard actually (effectively) endorse “ordinary person reasoning”, where only chains of reasoning of a bounded (and very short!) length are valid! (I seem to recall this being discussed somewhere here before… I can’t find it right now, though.) Anyway, I won’t bother commenting on this any further.
Oh, I found it. It wasn’t a discussion here; it was a post on Scott Aaronson’s blog: http://www.scottaaronson.com/blog/?p=232