Yes, anonymous, that’s exactly what I was getting at. Eliezer_Yudkowsky’s claim about a mind acquiring knowledge after first starting with nothing could only be true if we viewed the evolutionary history itself as the “mind”. My caution here is against thinking that one understands the human brain because one has inferred that, after conception, that human must have observed evidence on which he performed Bayesian inference (and that this process could somehow be captured in an AI). In reality, this need not be the case at all: that new human, upon growing, could simply have been fed accurate knowledge about the world, gathered through that evolutionary history, which happens to match the world even though he never gained it through any Bayesian inference of his own.
So, again, am I too far out on a limb here?