I am pessimistic about the possibilities for preventing this.
Why do you think it’s tougher to tackle than other human biases?
Personally, I've found reading things like the identity essay you linked to and the detachment essay I linked to useful… just like with most of the other biases we talk about on LW, it feels (from the inside, at least) like reading the right stuff can help.
The default reaction my brain (and so, generalizing from one example, I expect other brains on LW too) has to this whole program of overcoming biases is “oh cool! I’m the kind of person who overcomes biases now! That makes me so much better than all those other biased people.” Successfully overcoming other biases actually makes this one a little worse. But I think I’m okay with this one as long as I’m using it to motivate myself.
“Rationalist” or “someone who overcomes biases” seems to be a smaller identity than “part of LW” (as the latter tends to be construed, by SamLL for example, as including beliefs that are merely popular around here instead of having to do with rationality). If we can’t (or it would be counterproductive to) avoid adopting the former identity, it should still be possible (or sensible) to avoid adopting the latter identity.
So maybe this should be the last bias one overcomes then :)
In any case, this particular link doesn’t deal with identity itself, just a potential negative side effect.