How one responds to this dilemma depends on how one values truth. I get the impression that while you value belief in truth, you can imagine that believing a falsehood could, at its best, yield more long-term utility than believing a true fact at its worst. I would not be surprised if many others here felt the same way. After all, there's nothing inherently wrong with thinking this is so.
However, my value system is such that the value of knowing the truth greatly outweighs any possible gains you might have from honestly believing a falsehood. I would reject being hooked up to Nozick's experience machine on utilitarian grounds: I honestly consider the disutility of believing a falsehood to be that great.*
*(I am wary of putting the word "any" in the above paragraph, as maybe I'm not correctly valuing very large numbers of utilons. I'm not really sure how to evaluate differences in utility when it comes to things I really value, like belief in true facts. The value is so high in these cases that it's hard to see how anything could possibly exceed it, but maybe that's only because I don't understand how to properly value high-value things.)
I am skeptical. Do you spend literally all of your time and resources on increasing the accuracy of your beliefs, or do you also spend some on other forms of enjoyment?
Point taken.
Yet I would maintain that what I place high value on is belief in true facts paired with the other things I value. If I pair those other things with belief in falsehoods, their overall value is much, much less. In this way, I maintain a very high value on belief in true facts without committing myself to maximizing accuracy the way a paperclip maximizer maximizes paper clips.
(Note that I’m confabulating here; the above paragraph is my attempt to salvage my intuitive beliefs, and is not indicative of how I originally formulated them. Nevertheless, I’m warily submitting them as my updated beliefs after reading your comment.)
Okay, so if your utilities are configured that way, the false belief might be one you encounter, struggle with, and get over in a few years, coming out stronger for the experience.
For that matter, the truth might be "your world is, in fact, a simulation of your own design, for which you have (through carelessness) forgotten the control codes; you are thus trapped and will die here, accomplishing nothing in the real world". Obviously an extreme example, but if it is true, you probably do not want to know it.