The difficulty for me is that this technique is at war with having an accurate self-concept, and may conflict with good epistemic hygiene generally. For the program to work, one must seemingly learn to suppress one’s critical faculties for selected cases of wishful thinking. This runs against trying to be just the right amount critical when faced with propositions in general. How can someone who is just the right amount critical affirm things that are probably not true?
The difficulty for me is that this technique is at war with having an accurate self-concept, and may conflict with good epistemic hygiene generally.
Generally, I see no conflict here, assuming that the thing you’re priming yourself with is not something that might displace your core rationalist foundations.
If you’re riding a horse, it is epistemically rational to incorporate knowledge about the horse into your model of the world (to be aware of how it will react to a pack of wolves, or to an attractive mare during mating season), and it is instrumentally rational to be able to steer the horse where you want it to carry you.
Same with your mind—if you’re riding an evolutionary kludge, it is epistemically rational to incorporate knowledge about the kludge into your map of reality, and it is instrumentally rational to be able to steer it where you want it to go.
What matters is where you draw the line between the agent and the environment.
Is an actor practicing poor epistemic hygiene when they play a role?
Refraining from dispute is not the same thing as believing. Not discussing religion with your theist friends is not the same as becoming one yourself.