If someone is insufficiently rational to spot the problems in an argument against genocide, they’ll also be insufficiently rational to spot the problems in an argument in favor of genocide.
How does that follow? Certainly, “if someone is insufficiently rational to spot the problems with an argument for ~X, they are insufficiently rational to spot the problems with an argument for X” is not true in the general case.
It’s possible that being more intelligent will make you go from a true position to a false position, but it’s not something that will happen consistently. If you want someone to be more likely to believe a true thing, it’s better to make them smarter rather than stupider.
I agree with this, but it is a more nuanced position than the one Yudkowsky’s words above express.