A person doesn’t have to ‘infinitely value’ truth to always prefer the truth to a lie. The importance placed on truth merely has to outweigh the importance placed on anything else that could be gained by lying.
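A toy way to put this (my own illustration, not anyone’s official model): assign a finite utility $u_T$ to speaking truly, and let $g(s)$ be the value of whatever a lie would gain in situation $s$. The agent always prefers the truth so long as
$$u_T > g(s) \quad \text{for every situation } s,$$
which only requires $u_T$ to exceed the largest competing gain the agent will ever face, not to be infinite.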
That said, if the question is whether there is, or has ever been, a human who values truth more than anything else, the answer is almost certainly no. For example, I care about the truth a lot, but if I were given the choice between learning a single, randomly chosen fact about the universe and being given a million dollars, I’d pick the cash without too much hesitation.
However, as Eliezer has said many times, human minds represent only a tiny fraction of all possible minds. A mind that puts truth above everything else is certainly possible, even if it doesn’t exist yet.
Once we know we have programmed an AI that may lie to us, our rational expectations will make us skeptical of whatever the AI says, which is not ideal. It sounds like the AI’s programmer will have to cover up the fact that the AI does not always speak the truth.