If something cannot, at least in theory, be tested by experiment, then it has no effect on the world and lacks meaning from a truth standpoint, and therefore from a rational standpoint.
Better version: …then it has no effect on the world and therefore is not useful to have information about.
As to the rest of your post, I will make a general observation: you are speaking as if epistemic rationality is a terminal value. There’s nothing wrong with that (insofar as nobody can say someone else’s utility function is wrong), but you might want to think about whether that is what you really want.
The alternative is to allow epistemic rationality to arise from instrumental rationality: obtaining truth is useful insofar as it improves your plans to obtain what you actually want.
Good point. I often find myself torn between epistemic rationality as a terminal value and its alternative. My thinking is that learning to treat truth as the highest goal would be more useful to my career in physics, and better for the world, than steering toward my nearer, less important values.
So treating truth as the highest goal serves your other, even higher goals?
What behaviors are encapsulated by the statement that you’re treating truth as the highest goal, and why can’t you just execute those behaviors anyway?
Truth is the highest goal; being ‘right’ is a lower goal; improving a career in physics is the lowest goal.
Seeking truth is opposed to being ‘right’ (defending positions one already holds), but aligned with the career in physics.
If you’re justifying a terminal value, it’s not your real terminal value.
(That doesn’t strictly follow. People can and do justify things they don’t need to justify all the time. It does, however, suggest confusion.)