I think it’s fairly obvious that “maximizing truth” meant “maximizing the correlation between my beliefs and truth”.
Having accurate beliefs is an incredibly useful thing. You may well find it serves your utility better.
Truth is overrated. My prior was heavily biased toward truth, but then a brief and unpleasant encounter with nihilism caused me to lower my estimate.
And before you complain that this doesn’t make any sense either, let me spell out that this is an estimate of the probability that the strategy “pursue truth first, happiness second” yields, on average, more hedons than the strategy “pursue happiness using the current set of beliefs”.
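To make the comparison explicit (my own notation, just a sketch): write $H$ for hedons, $S_T$ for “pursue truth first, happiness second”, and $S_B$ for “pursue happiness using the current set of beliefs”. The estimate is then

$$p = \Pr\!\big(\mathbb{E}[H \mid S_T] > \mathbb{E}[H \mid S_B]\big),$$

i.e. the probability that the truth-first strategy comes out ahead in expected hedons — a bet about instrumental payoff, not a claim that truth is intrinsically valuable.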