It’s imaginable that Hitler might have discovered that Jews are people after all, if he had been just slightly more rational and spotted a flaw in his racist ideology. It’s also imaginable that Hitler might have been tricked into believing that his racial ideas were wrong, if he had been just slightly less rational and unable to spot the fallacy in the ideas of someone who objected to his racist policies.
It’s important that we recognize both of these as realistic possibilities.
If someone is insufficiently rational to spot the problems in an argument against genocide, they’ll also be insufficiently rational to spot the problems in an argument in favor of genocide.
How does that follow? Certainly, “if someone is insufficiently rational to spot the problems with an argument for ~X, they are insufficiently rational to spot the problems with an argument for X” is not true in the general case: the flaws in one argument may be subtle and the flaws in the other glaring, so a person can easily miss the former while still catching the latter.
It’s possible that becoming more intelligent will move you from a true position to a false one, but it won’t happen consistently. If you want someone to be more likely to believe a true thing, it’s better to make them smarter rather than stupider.
I agree with this, but it is a more nuanced position than the one Yudkowsky’s words above express.