Belief Bias: Bias in Evaluating AGI X-Risks
Where the evaluation of the logical strength of an argument is biased by the believability of its conclusion.
Insofar as rationality, and science itself, require a certain suspension of prejudgment, it is also the case that the heuristics associated with our hard-won experiential intuitions are a significant and important optimization. They help us work on things that actually matter, avoid needless distractions and detours, identify worthwhile observations, and so on.
The difficulty is that we want to apply our intuition too often, particularly because it is generally much faster and easier than actually doing analytic work. Furthermore, when something seems to disagree with or invalidate our intuition, there is a strong motivation to prevent that outcome, since such invalidation would imply that we are allowed to use the ‘fast/easy’ tool less often than we had previously assumed.
As such, arguments which produce results contrary to one’s own intuition about what “should” or “is expected to” be the case are also implicitly viewed as somewhat disabling and invalidating of one’s own expertise – particularly if there is also some self-identification as an ‘expert’. No one wants to give up cherished notions regarding themselves.
The net effect is that arguments perceived as ‘challenging’ will be challenged (criticized) more fully and aggressively than rationality and the methods of science would otherwise call for.
- Link: Wikipedia: Belief bias
- An item on Forrest Landry’s compiled list of biases in evaluating extinction risks.
No objections to this one besides “I only believe you because I already knew that”. It feels like a summary of Sequences content; maybe it deserves to be a topic page description rather than a post?