I suspect you aren’t sufficiently taking into account the magnitude of people’s irrationality and the non-monotonicity of rationality’s rewards. I agree that intelligence enhancement would have greater overall effects than rationality enhancement, but rationality’s effects will be more precisely targeted, and therefore more likely to work as existential risk mitigation.
I agree that a world where everyone had good critical thinking skills would be much safer. But getting there is super-tough. Learning is something most people HATE. Rationality—especially stuff involving probability theory, logic, statistics and some basic evolutionary science—requires IQ 100 as a basic prerequisite in my estimation.
I will discuss the ways we could get to a rational world, but this post is merely about a more intelligent world.
Could you elaborate on the shape of the rewards to rationality?
This was covered in some LW posts a while ago (which I cannot be arsed to look up and link); the paradigmatic example in those posts, I think, was a LWer who used to be a theist with a theist girlfriend, until reading OB/LW stuff convinced him of the irrationality of belief in God. Then his girlfriend left his hell-bound hide for greener pastures, and his life is in general poorer than when he started reading OB/LW and striving to be more rational.
The suggestion is that the relationship between rationality and well-being is U-shaped: you can be well-off as a Bible-thumper, and well-off as a stone-cold Bayesian atheist, but the middle is unhappy.
I’m not sure this is a fair statement. He did say he wouldn’t go back if he had the choice.
Increases in rationality can, with some regularity, lead to decreases in knowledge or utility (hopefully only temporarily and in limited domains).