Very interesting. I can’t help feeling that “trying to be a better rationalist” is somehow a paradoxical aim.
Roughly speaking, I would say that we have preferences, and there is no rational way of picking preferences. If you prefer pizza to ice cream, or pleasure to pain, or living to dying, then that is that. Rationality is a mechanism for effectively pursuing your preferences: ordering the pizza, not putting your hand in a fire, etc. You can’t pick rational preferences (goals); you can only pick a rational route towards those goals.
If you adopt “I want to be more rational” as a preference/goal in itself, it feels like the snake is eating its own tail.
Maybe “meta-goals” like this do arise elsewhere, e.g. “I don’t currently have any interest in being strong/rich/powerful/skilled for its own sake, nor are these things worth pursuing given my current preferences (which are more efficiently achieved by other means). However, they might be generically useful for achieving preferences I may or may not have in the future, so I should acquire them as tools for later.”
But if we take rationality to mean “taking the best actions with the available information to meet your goals”, then pursuing meta-goals as ends in themselves, rather than as instruments toward your actual goals, appears to be definitionally irrational, at least by this definition. This extends to the meta-goal of “being a better rationalist”.
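To make the tension concrete, here is a minimal decision-theoretic sketch (entirely my own toy formalization, with made-up names and numbers, not anything from the post): rationality as an argmax of expected utility under current preferences. In this model an action like “train rationality” has no direct payoff; it can only ever be justified instrumentally, by raising the quality of later choices.

```python
# Toy model: rationality as argmax of expected utility under current
# preferences. All action names and numbers are illustrative assumptions.

ACTIONS = {
    # action: (direct utility now, boost to quality of future decisions)
    "order_pizza":       (10.0, 0.0),
    "hand_in_fire":      (-100.0, 0.0),
    "train_rationality": (0.0, 0.3),  # no direct payoff; purely instrumental
}

def expected_value(action: str, future_stakes: float, decision_quality: float) -> float:
    """Direct utility plus the expected gain from deciding better later.

    `decision_quality` is the fraction of available future utility the
    agent actually captures; training rationality raises that fraction."""
    direct, boost = ACTIONS[action]
    return direct + min(1.0, decision_quality + boost) * future_stakes

def rational_choice(future_stakes: float, decision_quality: float = 0.6) -> str:
    """'Taking the best action with the available information to meet
    your goals' -- an argmax over current expected value."""
    return max(ACTIONS, key=lambda a: expected_value(a, future_stakes, decision_quality))

if __name__ == "__main__":
    # With little at stake later, the meta-action never wins...
    print(rational_choice(future_stakes=5.0))   # -> order_pizza
    # ...but with enough future stakes, it beats the direct payoff.
    print(rational_choice(future_stakes=40.0))  # -> train_rationality
```

The point the sketch is meant to echo: in this framing, “be more rational” only ever enters as a multiplier on other goals. Adopted as a terminal preference in its own right, it has nowhere to plug in.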