If you value truth intrinsically, then reducing your ability to approach it would hurt you, so I think my analysis is still applicable to some extent.
But you are probably right, since we are running into the issue of implicit goals. If I am a paperclip maximizer, then, from my point of view, any action that reduces the projected number of future paperclips in the world is immoral, and there's probably nothing you can do to convince me otherwise. Similarly, if you value truth as a goal in and of itself, regardless of its instrumental value, then your morality may be completely incompatible with the morality of someone who (for example) only values truth as a means to an end (i.e., as a way of achieving his other goals).
I have to admit that I don't know how to resolve this problem, or whether it has a resolution at all.