Nice write-up! I’m glad someone brought up this idea.
Here’s my take on this:
The human mind is an engine of cognition. Evolutionarily speaking, the engine is optimized for producing correct motor outputs. Whether its internal state is epistemically true does not matter (to evolution), except insofar as it affects present and future motor outputs.
The engine of cognition is made of biases/heuristics/parts that reason in locally invalid ways. Validity is a property of the system as a whole: the local errors/delusions (partially) cancel out. Think something like SSC’s Apologist and Revolutionary: one system comes up with ideas (without checking whether they are reasonable or possible), the other criticises them (without checking whether the criticism is fair). Both are “delusional” on their own, but the combined effect of the two is something approaching sanity.
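As a loose analogy (entirely my own sketch, not anything from the SSC post), you can think of it as a generate-and-criticize loop: one pass proposes indiscriminately, the other attacks indiscriminately, and only their composition behaves sensibly:

```python
# Toy sketch of the generate-then-criticize structure. The specific numbers
# and field names are made up for illustration only.
import random

random.seed(0)

def apologist(n=20):
    """Proposes candidate ideas indiscriminately, with no plausibility check."""
    return [{"id": i, "plausibility": random.random()} for i in range(n)]

def revolutionary(ideas):
    """Attacks every idea it sees; keeps only those surviving a harsh cutoff."""
    return [idea for idea in ideas if idea["plausibility"] > 0.7]

# Neither pass is "valid" on its own: one never filters, the other never
# proposes. The composition yields a shortlist that is both broadly
# generated and vetted.
survivors = revolutionary(apologist())
print(f"{len(survivors)} of 20 ideas survive criticism")
```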
One can attempt to “weaponize” these biases to improve the speed/efficiency of cognition. However, this can cause dangerous cognitive instability, because many false beliefs are self-reinforcing: the more you believe one, the harder it is to unbelieve it. A bias that reinforces itself. And once the cognitive engine has gone outside its stability envelope, there is no turning back: the person who fell prey to the bias is unlikely to change their mind until they crash hard into reality, and possibly not even then (think pyramid schemes, cults, the Jonestown massacre, etc.).
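To make the “stability envelope” picture concrete, here is a minimal toy model (my own assumption, not a claim about how real cognition works): belief strength lives in [0, 1], reality keeps pulling it toward 0, but openness to contrary evidence shrinks as the belief grows while self-reinforcement grows with it.

```python
# Toy dynamics: reality consistently says "false" (pulls b toward 0), but
# (a) the correction weight shrinks as b grows, and
# (b) a self-reinforcement term grows with b.
# All parameters are illustrative assumptions.
def run(b, steps=200, alpha=0.1, beta=0.1):
    for _ in range(steps):
        evidence_pull = alpha * (1 - b) ** 2 * (0.0 - b)   # correction from reality, weaker the more you believe
        self_reinforcement = beta * b ** 2 * (1 - b)        # believing more makes believing easier
        b = min(max(b + evidence_pull + self_reinforcement, 0.0), 1.0)
    return b

# Below the unstable threshold (here 0.5, where the two terms balance) the
# belief gets corrected back toward 0; above it, the positive feedback locks
# in and the belief runs toward certainty despite constant contrary evidence.
for b0 in (0.30, 0.45, 0.60, 0.90):
    print(f"start at {b0:.2f} -> end at {run(b0):.2f}")
```

The point of the sketch is just the threshold behaviour: inside the envelope, evidence still wins; outside it, the self-reinforcement term dominates and no amount of steady contrary evidence brings the belief back down.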