Beliefs which cannot be updated aren’t useful, but not all beliefs which might reasonably form a “worldview” are un-Bayesian. Maybe a better way to talk about worldviews is to think about beliefs which are highly depended upon: beliefs which, if they were updated, would also force large re-updates of many beliefs farther down the dependency graph.
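To make the dependency-graph picture concrete, here’s a toy sketch in Python (the belief names, probabilities, and the scaling rule are all made up for illustration, not a real update rule):

```python
# Toy sketch: beliefs as nodes in a dependency graph, where revising a
# top-level node forces a crude re-update of everything downstream.
# All names and numbers are illustrative.

beliefs = {
    "superintelligence_exists": 0.9,
    "it_influenced_history": 0.8,    # depends on the node above
    "miller_urey_was_seeded": 0.6,   # depends on the node above
}

# edges point from a belief to the beliefs that depend on it
dependents = {
    "superintelligence_exists": ["it_influenced_history"],
    "it_influenced_history": ["miller_urey_was_seeded"],
}

def update(belief, new_p):
    """Set a belief's probability, then cascade downstream."""
    old_p = beliefs[belief]
    beliefs[belief] = new_p
    for child in dependents.get(belief, []):
        # crude stand-in for a real Bayesian update: scale the child
        # by the same ratio, capped at 1.0
        update(child, min(1.0, beliefs[child] * new_p / old_p))

update("superintelligence_exists", 0.1)
print(beliefs)  # everything downstream has dropped too
```

The point isn’t the particular update rule; it’s that a top-level node’s position in the graph determines how much of the network moves when it does.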
Yes.
Beliefs have a hierarchy, and some are more top-level than others. Two of the most top-level beliefs:
1. a vast superintelligence exists
2. it has created/affected/influenced our history
If you give high weight to 1, then 2 follows and is strengthened, and this naturally guides your search for explanations for mysteries. A top-level belief sends down a massive cascade of priors that can affect how you interpret everything else.
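To put numbers on “2 follows and is strengthened”: by the law of total probability, P(2) = P(2|1)·P(1) + P(2|¬1)·P(¬1), so as long as P(2|1) > P(2|¬1), raising the weight on 1 mechanically raises the weight on 2. The conditionals below are assumptions picked purely for illustration:

```python
# Law of total probability: P(2) = P(2|1)*P(1) + P(2|not-1)*P(not-1).
# The conditional probabilities are made up for illustration.
p_2_given_1 = 0.7     # if a superintelligence exists, intervention is likely
p_2_given_not1 = 0.0  # if none exists, it can't have intervened

for p_1 in (0.01, 0.5, 0.99):
    p_2 = p_2_given_1 * p_1 + p_2_given_not1 * (1 - p_1)
    print(f"P(1) = {p_1:.2f}  ->  P(2) = {p_2:.3f}")
# Output climbs from ~0.007 to ~0.693: weight on 1 drags 2 up with it.
```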
If you hold the negation of 1 and/or 2 as top-level beliefs, then you look for natural explanations for everything. Arguably the negation of ‘goddidit’ as a top-level belief was a major boon to science because it tends to align with Ockham’s razor.
But at the end of the day it’s not inherently irrational to hold these top-level beliefs. Francis Crick, for instance, looked at the origin-of-life problem and decided that an unnatural explanation involving a superintelligence (alien) was actually a better fit.
A worldview comes into play when one jumps to #3 with Miller-Urey because it fits with one’s top-level priors. Our brains are built around hierarchical induction, so we always have top-level biases. This isn’t really an inherent weakness, as there is probably no better (more efficient) way to do it. But it is still something to be aware of.
But at the end of the day it’s not inherently irrational to hold these top-level beliefs. Francis Crick, for instance...
But I don’t think Crick was talking about a “vast superintelligence”. In his paper, he talks about extraterrestrials sending out unmanned long-range spacecraft, not anything requiring what I think he or you would call superintelligence. In fact, he predicted that we would have that technology within “a few decades”, though rocket science isn’t among his many fields of expertise, so I take that with a grain of salt.
A worldview comes into play when one jumps to #3 with Miller-Urey because it fits with one’s top-level priors.
I don’t think that’s quite what happened to me, though; the issue was that it didn’t fit my top-level priors. The solution wasn’t to adjust my worldview belief but to apply it more rationally; I ran into an akrasia problem and concluded #3 because I hadn’t examined my evidence well enough, even by my own standards.