How should we deal with cases where epistemic rationality contradicts instrumental rationality? For example, we may want to use the placebo effect, because one of our values is that being healthy is better than being sick, and less pain is better than more pain. But the placebo effect depends on believing the pill is a working medicine, which is false. Is there any way to satisfy both epistemic and instrumental rationality?
It varies from case to case, I would think. There are instances where you most probably benefit from trading off epistemic rationality for instrumental, but when things are too chaotic to get a good estimate and the tradeoff seems close to even, I would personally err on the side of epistemic rationality. Brains are complicated: forcing a placebo effect might have ripple effects across your psyche, like an increased tendency to shut down the voice in your head that speaks up when you know, on some level, that your belief is wrong (a very speculative example), all for limited short-term gain.
It seems to me that this is not a contradiction between the two rationalities. Rather, it is similar to the resonance of doubt. If a placebo works when you believe in it, that means that if you believe in it, your belief is true. What you would need is the reverse: a case where believing something is true makes it false. (Believing that something is safe won't work here either, since you just need to avoid acting more carelessly on the assumption of safety, which is simply a matter of instrumental rationality.)
If you believe that the placebo works, it works; you're right in believing it works. If you don't believe that the placebo works, it doesn't work; you're right in believing it doesn't work.
If you believe that the sky is blue, you're right. If you believe that the sky is green, it's still blue, and you're wrong.
Truths that involve humans have some amount of reflexivity.
I'd say you shouldn't force yourself to believe something (epistemic rationality) to achieve a goal (instrumental rationality). This is because, in my view, human minds are addicted to feeling consistent, so it would be very difficult (i.e., resource-expensive) to believe a drug works when you know it doesn't.
What does it even mean to believe something is true when you know it's false? I don't know. Whatever it means, it would have to be a psychological matter rather than an epistemological one. My personal recommendation is to only believe things that are true. The modern environment we live in generally rewards rational behavior based on knowledge anyway, so the problem rarely needs to surface.
Thank you, wonderful series!