Some beliefs seem to naively imply radical and dangerous actions, but there are often rational reasons not to act on those beliefs. Knowing those reasons is really important for those who don’t have a natural defense mechanism.
Most people have a natural defense mechanism, which is to not take ideas seriously. If you just follow what others do, it’s less likely that an error in your explicit reasoning will lead you to do something radical and dangerous. The more likely you are to make such errors, the more (evolutionarily and individually) advantageous it is for you to have a conformist instinct.
The answer to this question is mostly meant for people with whom I want to share ideas that are dangerous if taken at face value / at the object level (I want to make sure they have those defense mechanisms first; I encourage you to do the same, and to do your due diligence when discussing dangerous ideas; this post is not sufficient). I want to advocate that smart people take ideas more seriously, but I don’t want them to fully repress their conformist instincts, especially if they haven’t built up explicit defense mechanisms. This post should also be useful for people who already lack those defense mechanisms, and for people who want to better understand the function of conformity (although conformity is not the only defense mechanism).
Note that the defense mechanisms are not meant as fully general counterarguments. They are not insurmountable (at least, not always); they just indicate when it’s prudent to want more evidence.
As a small tangent, exploration also often has a positive externality:
Like it’s rational for any individual to be pursuing much more heavily exploitation based strategy as long as someone somewhere else is creating the information and part of what I find kind of charming and counterintuitive about this is that you realize people who are very exploratory by nature are performing a public service. (source: Computer science algorithms tackle fundamental and universal problems. Can they help us live better, or is that a false hope?)
I will post my answer below.
Model uncertainty
Even if your model says there’s a high probability of X, it doesn’t mean X is very likely. You also need to take into account the probability that the model itself is right. See: When the uncertainty about the model is higher than the uncertainty in the model. For example, you could ask yourself: what’s the probability that I could read something that would change my mind about the validity of this model?
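To make this concrete, here’s a toy calculation with made-up numbers, using the law of total probability to combine what the model says with your credence in the model itself:

```python
# Made-up numbers: the model reports 95%, but you're only 70% confident
# in the model itself, and your fallback guess if the model is wrong is 50%.
p_x_given_model_right = 0.95
p_model_right = 0.70
p_x_given_model_wrong = 0.50

# Law of total probability: P(X) = P(X|M) P(M) + P(X|not M) P(not M)
p_x = (p_x_given_model_right * p_model_right
       + p_x_given_model_wrong * (1 - p_model_right))
print(round(p_x, 3))  # ~0.815, noticeably lower than the model's 0.95
```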
Beliefs vs impressions
Even if you have the impression that X is true, it might still be prudent to believe that maybe ~X if:
a lot of people you (otherwise) trust epistemologically disagree
a lot of our thinking on this still seems confused
it seems like we’re still making progress on the topic
it seems likely that there’s a lot of unknown unknowns
this type of question has a poor track record of being tackled accurately
you have been wrong about similar beliefs in the past
etc.
See: Beliefs vs impressions
Option value
Option value is generally useful; it’s a convergent instrumental goal. Even if you are confident about some model of the world or moral framework, it might still be a priority to keep your options open, just in case you’re wrong. See: Hard-to-reverse decisions destroy option value.
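As a toy illustration with made-up numbers: suppose your model says plan A is worth 10 and plan B only 2, you’re 90% confident in the model, and A would actually be worth -20 if the model is wrong. If choosing B keeps the option of switching to A open until you find out whether the model holds, waiting has the higher expected value:

```python
# Made-up numbers: comparing an irreversible commitment against keeping
# the option to decide after you learn whether your model was right.
p_model_right = 0.9

# Commit to plan A now (irreversible): great if the model is right,
# costly if it is wrong.
ev_commit = p_model_right * 10 + (1 - p_model_right) * (-20)

# Keep the option open: pick A only once the model is confirmed,
# otherwise stick with the safe plan B.
ev_wait = p_model_right * 10 + (1 - p_model_right) * 2

print(round(ev_commit, 2), round(ev_wait, 2))  # 7.0 vs 9.2
print(round(ev_wait - ev_commit, 2))           # about 2.2 of option value
```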
Group rationality
Promoting a norm of taking actions even when they are based on a model of the world that few people share seems bad. See: Unilateralist’s curse.
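A minimal simulation sketch of the mechanism, with illustrative parameters I’m assuming (they are not from the curse’s formal statement): if an action’s true value is slightly negative, each person estimates it independently with some noise, and the action gets taken as soon as any one of them judges it positive, then the chance that someone acts grows quickly with the number of people who can act unilaterally:

```python
# Illustrative parameters (not from the post): the action's true value is
# -0.5, each person estimates it with independent Gaussian noise, and the
# action happens as soon as any one person's estimate comes out positive.
import random

def p_someone_acts(n_agents, true_value=-0.5, noise_sd=1.0, trials=100_000):
    acted = 0
    for _ in range(trials):
        if any(random.gauss(true_value, noise_sd) > 0 for _ in range(n_agents)):
            acted += 1
    return acted / trials

for n in (1, 5, 20):
    print(n, round(p_someone_acts(n), 3))
# Typical output: about 0.31 for 1 person, 0.84 for 5, and 0.999 for 20;
# with enough unilateral actors, the harmful action almost surely happens
# even though each individual estimate is unbiased.
```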
This depends a lot on the actions (and the world). If few people think cryonics will work out, but you do it anyway...oh no! You might live longer!
Some actions don’t seem dangerous.
If few people think the spread of disease can be reduced by washing hands, but you’re in charge of the hospital, and you make people do it, and see that the rate of patients dying (or getting sick) drops enormously, then why not continue? Why not roll it out to the rest of the hospital?*
Empiricism may march ahead of consensus, especially when there’s a lot of very low hanging fruit.*
*Historically this once included ‘wash your hands after conducting an autopsy, and especially before doing anything to help out with a pregnancy’.
Ah, yes absolutely! I should have specified my original claim further to something like “when it affects the whole world” and “a lot of people you’ve identified as rational disagree”.