Which is why I call it rational irrationality, or rationally irrational if you would prefer.
I do think it is possible to semantically stretch the conception of rationality to cover this, but I still think a fundamental distinction needs to be acknowledged between rationality that leads to taking control in a situation, and rationality that leads to intentional inaction.
I feel like you are conflating terminal values (goals) and instrumental values (means/effectiveness) a little bit here. There’s really no good reason to adopt an instrumental value that doesn’t help you achieve your goals. But if you aren’t sure of what your goals are, then no amount of improvement of your instrumental values will help.
I’m trying to distinguish between the circumstance where you aren’t sure whether inactivity will help achieve what you want (if you want your spouse to complete a chore, should you remind them or not?) and the circumstance where you aren’t sure whether inactivity is what you want (do I really like meditation or not?).
In particular, your worry about accuracy of maps and whether you should act on them or check on them seems to fundamentally be a problem about goal uncertainty. Some miscommunication is occurring because the analogy is focused on instrumental values. To push a little further on the metaphor, a bad map will cause you to end up in Venice instead of Rome, but improving the map won’t help you decide if you want to be in Rome.
Rationality is a tool for making choices. Sometimes the rational choice is not to play.