Evolution favors the attitudes that make us most likely to produce viable offspring. If that is also one’s own main goal, then I suppose logical fallacies should be accepted when they have a clear evolutionary basis and still seem likely to serve that goal. However, whether it makes sense to place reproduction as one’s top priority depends on various circumstances, including one’s emotions. From what I’ve read of Eliezer Yudkowsky, holding accurate beliefs seems to matter more to him. In that case, the fact that a belief helps us survive long enough to reproduce does not make it “useful,” and “criticizing evolution” isn’t really what he’s doing. Evolution /isn’t/ a designer, and it /isn’t/ always completely efficient (not that any designer is), but even if it is completely efficient here, that efficiency is directed toward a goal he does not share, so it isn’t necessarily relevant to him.