This post does not seem to contribute much. As nawitus pointed out, the post he linked already does a good enough job of distinguishing between instrumental and epistemic rationality.
While it seems obvious that in some cases, a false belief will have greater utility than a true one (I can set up a contrived example if you need one), it’s a devil’s bargain. Once you’ve infected yourself with a belief that cannot respond to evidence, you will (most likely) end up getting the wrong answer on Very Important Problems.
And if you’ve already had your awakening as a rationalist, I’d like to think it would be impossible to make yourself honestly believe something that you know to be false.
Yes, the irony in the last statement is intended.
I did not hypothesize about infecting oneself with a belief that doesn't respond to evidence.
The kind of hypothetical faith I spoke of would respond to evidence: evidence of what is conducive to being able to act according to one's values.
In that case, the following is misleading:
"At any rate, a belief that would not respond to evidence of its truth or falsehood would be sufficiently malign."
Ah, yes, you’re correct. That was poorly written.
On your latter point: do you really mean that, in the thought experiment where someone wanting to shoot your friend comes to you to ask for directions, you hope you couldn't make yourself honestly believe falsehoods that you could then convey to the assassin, thereby misdirecting him without seeming dishonest?
Indeed. As long as we’re asking for superpowers, I’d prefer to have the ability to defeat the assassin, or to credibly lie to him without believing my lie.
Given that this situation is not going to happen to me, I’d rather keep the ability to distinguish truth from falsehood without epistemically poisoning myself.