I’m not entirely sure how belief in belief fits in here. The dragon’s unlucky host doesn’t merely believe in belief: as you go out of your way to point out, he has excellent evidence of the creature’s existence and can make predictions based on it. His fatal error is of a different category: rather than adopting a belief for signaling reasons and constructing a model which excuses him from providing empirical evidence for it, he’s constructed a working empirical model and failed to note some of its likely consequences.
An imperfect model of an empirical reality can show fatal gaps when applied to the real world. But that’s not the error of a tithing churchgoer whose concern for his immortal soul disappears in the face of a tempting Tag Heuer watch; it’s the error of a novice pilot who fails to pull out of a tailspin, or of a novice chemist who mistakenly attempts to douse a sodium fire with a water-based foam. A level-one error, in other words, whereas belief in belief would be level zero or off the scale entirely.
The post was inspired by a comment which I felt conflated lack of internalization with belief in belief. On reflection, I probably didn’t establish the connection sufficiently.
Yeah, that clarifies some things. Reading over the OP, I note with some embarrassment that you never used the phrase “belief in belief” in the body text—but I also note that Mass_Driver didn’t, either.
“Understanding Your Understanding” does a pretty good job of illustrating the levels of belief, but now I’m starting to think that it might be a good idea to look at the same scale from the perspective of expected error types, not just the warning signs the article already includes.