But this should show at least how difficult it is for an irrational person to attempt to become more rational; it’s like having to know the rules in order to play by the rules. What does it take to commit to wanting rationality, starting from irrationality?
You need some initial luck. The human mind is like a self-modifying system, where the rules can change the rules, again and again. Thus the human mind floats around in a mindset space. The original setting is rather fluid, for evolutionary reasons—you should be able to join a different tribe if that becomes essential for your survival. On the other hand, the mindset space contains some attractors: if you happen to have a certain set of rules, those rules keep preserving themselves. Rationality could be one of these attractors.
Is the inability to update one’s mind really so exceptional on LW? One way of not updating is “blah, blah, blah, I don’t listen to you”. This happens a lot everywhere on the internet, but LW is probably not attractive to such people. The more interesting case is “I listen to you, and I value our discussion, but I don’t update”. This seems paradoxical, but I think it’s actually not unusual… the only unusual thing is the naked form—people who refuse to update and recognize that they refuse to update. The usual form is that people pretend to update… except that their updates don’t fully propagate. In other words, there is no update, only belief in update. Things like: yes, I agree about the Singularity and all that, but somehow I don’t sign up for cryopreservation; I agree that human lives are valuable and that there are charities which can save a hundred human lives for every dollar sent to them, but somehow I haven’t sent a single dollar yet; I agree that rationality is very important and that being strategic can increase one’s utility, and then I procrastinate on LW and other websites while my everyday life goes on without any change.
We are so irrational that even our attempts to become rational are horribly irrational, and that’s why they often fail.