So… uh… how can I use this to see if, say, rats are guaranteed to not be conscious?
Short answer: It won’t guarantee that, because rats learn most of what they know. The equation I developed turns out to be identical to an equation saying that the amount of information contained in facts and data must be at least as great as the amount of information it takes to specify the ontology. So any creature that learns its ontology automatically satisfies the equation.
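[A minimal formalization of that claim, assuming “amount of information” means something like description length in bits; the dialogue itself doesn’t fix the measure:

$$ I(\text{facts and data}) \;\geq\; I(\text{ontology}) $$

If the ontology was learned, it is recoverable from the facts and data, so the left side already accounts for the right side and the inequality holds automatically.]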
… Could we take as the input the most a rat could ever learn?
I don’t understand the question. It’s an inequality, and in cases where the inequality is satisfied, the answer it gives is “I don’t know”. The answer for a rat will always be “I don’t know”.
I must confess I didn’t understand most of what you’ve said, but did I guess the following right? The equation says that
IF my knowledge is at least as “big” as my ontology THEN I might be conscious
And if I learned my ontology, that means my ontology is a subset of my knowledge, and thus never bigger than the latter.
Right.
Exactly.
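[A toy sketch of the test’s behavior under the same description-length assumption; the function name and bit counts below are hypothetical illustrations, not from the dialogue:

```python
def consciousness_test(knowledge_bits: float, ontology_bits: float) -> str:
    """Necessary-condition test: the information in what a creature knows
    must be at least the information needed to specify its ontology."""
    if knowledge_bits >= ontology_bits:
        # Inequality satisfied: consciousness cannot be ruled out.
        return "I don't know"
    # Inequality violated: the necessary condition for consciousness fails.
    return "not conscious"

# A creature that learns its ontology has ontology ⊆ knowledge, so
# knowledge_bits >= ontology_bits holds by construction: the test can
# only ever answer "I don't know" (the rat case above).
print(consciousness_test(knowledge_bits=1e6, ontology_bits=1e4))
```

Under that reading, a creature that learns its ontology can only ever get “I don’t know”.]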
That’s like watching the Wright Brothers fly their airplane at Kitty Hawk, then asking how to fly to London.
If the numbers came up to say that rats don’t need to be considered conscious, I would think the numbers were probably wrong.