Hmm, maybe I shouldn’t have said “always”, given that signaling a belief you don’t hold requires acting ability, but I do think what I suggest is the ideal. Someone who trained themselves to do it, by studying people skills and so forth, would do better: they’d get the social benefits of conformity without the disadvantages of false beliefs clouding their predictions (though admittedly the time investment of learning these skills would have to be weighed).
Short version: I think this is possible with training and would make you “win” more often, so it’s what a rationalist would do (unless the cost of training proved prohibitive, which I doubt, since these skills are very transferable).
I’m not sure what you meant by the magisteria remark, but I get the impression that holding spiritual/long-term beliefs to less stringent standards than short-term ones isn’t generally seen as a good thing (see Eliezer’s “Outside the Laboratory” post, among others).