Suppose you have an extremely high prior probability for God sending doubters to Hell, for whatever reason. Presumably the utility of going to Hell is very, very low. Then, as a rational Bayesian, you should avoid any evidence that would tend to cause you to doubt God, shouldn’t you?
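To make the trade-off concrete, here is a minimal expected-utility sketch with entirely made-up numbers; the prior, the chance of doubt, and the utilities are all assumptions for illustration, not part of the scenario itself. The point is just that as long as the disutility of Hell dwarfs whatever the evidence is worth, the comparison favours never looking.

```python
# Hypothetical numbers only: p is the prior that doubters are sent to Hell,
# q is the chance that examining the evidence pushes you into a doubt-state,
# u_hell is the (very negative) utility of Hell, u_truth the modest value
# of whatever the evidence would teach you.

p = 0.99        # assumed prior: God sends doubters to Hell
q = 0.10        # assumed chance that looking at the evidence produces doubt
u_hell = -1e9   # assumed utility of going to Hell
u_truth = 1.0   # assumed value of learning from the evidence

# Expected utility of avoiding the evidence: you never enter a doubt-state.
eu_avoid = 0.0

# Expected utility of examining it: with probability q you end up doubting,
# and with probability p that doubt costs you u_hell.
eu_examine = u_truth + q * p * u_hell

print(eu_avoid, eu_examine)  # under these numbers, avoidance wins by a huge margin
```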
This is a preference over rituals of cognition: it selects not just your decisions but the algorithms by which you arrive at them. Usually we assume that only the decisions matter, not the thought process that produced them. If you did live in such a world, I agree you should avoid getting into a doubt-state, although you might still benefit from building an external reasoning device to resolve the problem for you, since it wouldn't be bound by the restrictions on allowed cognitive algorithms.
Also, I'd guess that an altruistic person should still undergo a conversion to rationality, on the chance that the evidence shows the inborn priors to be incorrect, thereby sparing their fellow people from living under such limitations on thought.
Well, if you’re altruistic in the sense you describe, you don’t have the utility function I gave in my scenario, so your result will vary. If, comparatively speaking, you don’t mind going to Hell all that much, the argument doesn’t work well.
Of course.