Very worthwhile points, Tim_Tyler.
First of all, the reason for my spirited defense of MH’s statement is that it looked like a good theory because of how concise it was, and how consistent it was with my knowledge of programs. So, I upped my prior on it and tended to see apparent failures of it as a sign that I wasn’t applying it correctly, and that further analysis could yield a useful insight.
And I think that belief is turning out to be true:
It seems to specify that the output is what is unknown—not the sensations that output generates in any particular observer.
But the sensations are a property of the output. In a trivial sense: it is a fact about the output that a human will perceive it in a certain way.
And in a deeper sense, the numeral “9” means “that which someone will perceive as a symbol representing the number nine in the standard number system”. I’m reminded of Douglas Hofstadter’s claim that the definition of individual letters is an AI-complete problem, because you must know a wealth of information about the cognitive system to be able to identify the full set of symbols someone will recognize as e.g. an “A”.
This yields the counterintuitive result that, for certain programs, you must reference the human cognitive system (or some concept isomorphic thereto) in listing all the facts about the output. That result must hold for any program whose output will eventually establish mutual information with your brain.
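A toy sketch of what I mean (my own illustration, not anything from MH): the raw bytes a program emits don’t carry “nine-ness” by themselves; which symbol they constitute depends on the decoding convention the observer brings to them.

```python
# The same output bytes yield different "facts" under different decoders.
output = b"\x39"  # the bytes the program actually emits

# A reader applying the ASCII convention sees the numeral "9".
as_ascii = output.decode("ascii")

# The very same bytes, read as a raw big-endian integer, are 57.
as_int = int.from_bytes(output, "big")

print(as_ascii, as_int)  # -> 9 57
```

So “the output is a 9” is shorthand for a fact that quantifies over an interpreting system, which is the sense in which listing all the facts about the output drags in the observer.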
Am I way off the deep end here? :-/