Welcome to Less Wrong! Feel free to post an explicit introduction on that thread, if you’re hanging around.
I think the critical point is in the next sentence:
We only get the feeling of certainty, a knowledge of this being right, and that feeling cannot be broken into parts that could be subjected to criticism to see if they add up.
Yes, we don’t know what the interiors are—but the original source of our confidence is our (frequently justified) trust in our intuitions. I think another related point is made in How An Algorithm Feels From Inside, which describes an experience that is illusory, merely an artifact of the way the brain processes data. The brain usually doesn’t bother flagging a result as a result; it just marks it as true and charges forward. As a consequence, we don’t notice that we are generalizing from the pattern of news stories we watched, and therefore don’t realize our generalization may be wrong.