Contradiction Appeal Bias
Contradiction Appeal Bias, or Counterintuitivity Bias, is the tendency to believe statements that contradict common knowledge simply because they are unexpected. Think conspiracy theories, health myths, and so on.
Causes
Several factors drive this bias, including but not limited to:
- Self-Doubt: Individuals unsure of their own understanding may be more easily swayed by contradictory information.
- Emotion: Alarming claims, especially those that trigger fear, can make people believe without much evidence.
- Information Overload: When bombarded with information, counterintuitive claims might be accepted without scrutiny.
Examples
- Flat Earth Theory: Despite the scientific consensus on a spherical Earth, some are drawn to the flat Earth idea because it starkly contradicts mainstream thought. Though I do not claim that this idea alone explains the appeal of conspiracy theories.
- Health Myths: Claims like “drinking cold water after meals causes cancer” can spread rapidly, not due to scientific evidence, but because they challenge common practices and beliefs.
This is a rough first sketch. Looking for quick feedback on the idea, will update in the future.
Some factors I’ve noticed that increase the likelihood that a fringe conspiracy theory is believed:
- Apparent Unfalsifiability: Nothing a layperson could do within their immediate means, without insider knowledge or scientific equipment, could disprove the theory. The mainstream truth has to be taken on trust in powerful institutions. This works well with stochastic or long-term health claims, or claims of some hidden agenda perpetrated by a secret cabal.
- Complexity Reduction: The claim takes a highly nuanced, multifaceted, difficult domain and reduces its cause to one as simple as its effects. This creates a clearer model of the world in the mind of whoever accepts the claim.
- Projection of Intent: The claim takes some effect or outcome in the world that is the natural consequence of various competing factors and network effects and recasts it as the deliberate outcome of a specific group who intended it to happen. This is somewhat comforting, even if it describes an ominous specter of an evil, secret government agency, because it turns something scary in the world from a mysterious, unmanageable existential threat into something attributable to real people who can, in theory, be stopped.
- Promise of Control: The claim offers listeners a path to resolving the problem through knowledge unknown to the mainstream, and suggests that something common knowledge would call random or uncontrollable is in fact controllable and solvable.
- Promise of Absolution: The claim, if true, justifies the bad things done by a group the listener aligns with, things that would normally make the group untenable as a moral authority. This is why it is useful to claim a political group’s enemies are vampiric, world-destroying pedophiles: if true, any evil committed by the group pales in comparison, and its position as an opponent of the evil group vindicates and absolves all the ostensible wrongdoing that can’t be denied.
- Rapturous Timeline: The claim presents a timeline wherein the “unworthy” majority will suffer a great negative outcome, while the minority “in the know,” those who buy into the claim, will be granted a golden ticket to a better world realized after some great future event.
These elements may not make a claim more convincing to the average person, but for certain groups they provide clear psychological incentives to believe even outlandish claims.
Yet more:
- Status Grabs: Having arcane knowledge unknown to the sheeple confers status.
- Resentment and Excuses: You’re not a failure, because They control everything, so you never had a chance.