Good point, I often find myself torn between epistemic rationality as a terminal value and its alternative. My thinking is that learning to treat truth as the highest goal would be more useful to my career in physics and better for the world than continuing to steer toward my closer, less important values.
So treating truth as the highest goal serves your other, even higher goals?
What behaviors are encapsulated by the statement that you’re treating truth as the highest goal, and why can’t you just execute those behaviors anyway?
Truth is the highest goal; being ‘right’ is a lower goal; improving a career in physics is the lowest goal.
Seeking truth is opposed to being ‘right’, but aligned with the career in physics.
If you’re justifying a terminal value, it’s not your real terminal value.
(That doesn’t strictly follow. People can and do justify things they don’t need to justify all the time. It does suggest confusion, though.)