Rationality doesn’t tell you what your goals are, and martial arts don’t tell you which people to defeat.
It does, surprisingly. If you don’t know what your goals are, there are better and worse ways of figuring that out, and errors at this level have pronounced, if subtle and hard-to-notice, consequences. There is probably even a sense in which it’s impossible to know your goals (or their definition, etc.) exactly, to reach a point where you are allowed to stop creatively working on the question.
I agree that rationality should help you figure out your instrumental goals, but it’s easy to view this as ‘a way to better achieve your higher-level goals’.
Not just instrumental goals. If you believe that you should achieve something, it doesn’t automatically mean that you really should. Your belief is a fact about your brain, which is not always in good alignment with your values (even though it really tries).
When you notice that you want something (as a terminal goal), you are reflecting on the fact that your brain, probably the best value-estimating apparatus you’ve got, has calculated that pursuing this goal is good. It could be wrong; it’s your job now to figure out whether it made an error in that judgment. Maybe you can find a way to improve on its reasoning process, compensating for a specific flaw and thus gaining access to a superior conclusion produced by the improved procedure (which is often ultimately the point of knowing how things work). (Or maybe you’ll even find an argument that makes taking into account what your own brain tells you in a given instance a bad idea.)
But where do values reside? How do you know that your belief does not correspond to your values?
Where does truth about arithmetic reside? How can you ever find out that you’ve miscalculated something? Apply similar principles to moral questions.