Well, if I see the screen, then there’s an encoding of “28” in my brain. Not of the reason why 28 is true, but at least of the fact that the answer is “28”.
You believe that “the computer contains a copy of Understand”, not “the computer contains a book with the following text: [text of Understand]”.
Obviously, on the level of detail in which the notion of “belief” starts breaking down, the notion of “belief” starts breaking down.
But still, it remains: when we say that I know a fact, the statement of that fact is encoded in my brain. Not the referent, not an argument for that statement, just: a statement.
Yet you might not know the question. “28” only certifies that the question makes a true statement.
Exactly. You don’t know [text of Understand], yet you can reason about it and use it in your designs. You can copy it elsewhere, and you’ll know that it’s the same thing somewhere else, all without having an explicit definition of the text (or any definition at all), only diverse intuitions describing its various aspects and tools for performing operations on it. You can take an md5 sum of the text, for example, and make a decision depending on its value, and you can rely on that being an md5 sum of exactly the text of “Understand” and nothing else, even though you don’t know what the text of “Understand” is.
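The md5 point can be made concrete. A minimal sketch (the byte string here is a stand-in of my own, not the actual text of the story): the program copies and fingerprints a text without ever inspecting its contents, yet can still rely on the fingerprint identifying exactly that text and nothing else.

```python
import hashlib

def fingerprint(text: bytes) -> str:
    """Return the md5 hex digest of some bytes we never otherwise inspect."""
    return hashlib.md5(text).hexdigest()

# A stand-in for the text of "Understand" -- treated as fully opaque.
original = b"(stand-in bytes for a text we never read)"

# Copy it elsewhere without ever "knowing" its contents.
copy = bytes(original)

# We can still rely on the copy being exactly the same text,
# and make decisions that depend on the text's identity
# without any definition of the text itself.
if fingerprint(copy) == fingerprint(original):
    print("same text")
```

The decision branch never needed the text's meaning, only a reliable handle on its identity.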
This sort of deep wisdom needs to be treated as the enemy (it strikes me often enough). It acts as a curiosity-stopper, covering up the difficulty of understanding things more accurately. (What’s “just a statement”?)
In certain AI designs, this problem is trivial. In humans, this problem is not simple.
The complexities of the human version of this problem are not relevant to anything in this overarching discussion (that I am aware of).
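The claim that this is trivial in certain AI designs can be sketched. Assume (this is my own illustration, not anyone’s actual design) an agent whose knowledge is literally a store of statements: the agent “knows” X exactly when the string “X” sits in its store, with no referent and no argument attached.

```python
class TrivialBeliefStore:
    """An agent for which "knowing a fact" is trivially well-defined:
    it professes a statement exactly when that statement is stored."""

    def __init__(self):
        self._statements = set()

    def learn(self, statement: str) -> None:
        # Store only the statement -- not its referent,
        # not an argument for it, just the statement.
        self._statements.add(statement)

    def professes(self, statement: str) -> bool:
        return statement in self._statements

agent = TrivialBeliefStore()
agent.learn("the answer is 28")
print(agent.professes("the answer is 28"))       # True
print(agent.professes("the reason it is 28"))    # False
```

In a design like this, the question “what does it mean for the agent to believe X?” has a one-line answer; the human case has no such clean reduction.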
So you say. Many would say that you need the argument (proof, justification, evidence) for a true belief for it to qualify as knowledge.
Obviously, this doesn’t prevent me from saying that I know something without an argument.
You can say that you are the Queen of Sheba.
It remains the case that knowledge is not lucky guessing, so an argument, evidence or some other justification is required.
Yes, but this is completely and totally irrelevant to the point I was making, that:
I will profess that a statement, X, is true, if and only if “X” is encoded in a certain manner in my brain.
Yet “X is true” does not mean “X is encoded in this manner in my brain.”