Didn’t Eliezer write something about how assuming that your ideological rivals must be defective or aberrant is a bad assumption to make? He phrased it in terms of “evil”, but I think the same principle applies to “stupid/insane”.
As for ignorant, well, isn’t it almost tautologically true that we all believe people who hold beliefs incompatible with our own to be ignorant or mistaken?
Didn’t Eliezer write something about how assuming that your ideological rivals must be defective or aberrant is a bad assumption to make?
Yes, but sometimes that isn’t an assumption but a conclusion. I can think of a large number of ideological and non-ideological issues where I wouldn’t draw that conclusion. Evolution is one where the conclusion seems easier to reach (with the caveat that in the relevant quote “insane” is considered broad enough to mean “highly irrational and subject to cognitive biases, in the way almost all humans are about at least a few things”).
As for ignorant, well, isn’t it almost tautologically true that we all believe people who hold beliefs incompatible with our own to be ignorant or mistaken?
There are degrees of how ignorant or mistaken someone can be. For example, Sniffnoy and I are coauthoring a pair of papers on integer complexity. There are certain conjectures we can’t prove, and we have different opinions about whether they are true or false. I’m pretty sure that at this point he and I are among the 5 or 6 people on the planet who understand the relevant problems best. So our disagreement doesn’t seem to be due to ignorance.
we have different opinions about whether they are true or false.
Probabilistic opinions?
Can you take a set of “unrelated” theorems known to be true or false (the inapplicability of “unrelated” to math might make this suggestion worth very little) and give your opinions about the chances each is true?
Also relevant are the costs of type I and type II errors in your paper... and in your lives, as these may have significantly conditioned your reactions to uncertainty.
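The calibration exercise suggested above could be scored numerically. A minimal sketch, assuming we use the Brier score (mean squared error between stated probability and the 0/1 outcome; lower is better) — the scoring rule is my own suggestion, not something proposed in the thread:

```python
def brier_score(predictions, outcomes):
    """Score probabilistic predictions against known truth values.

    predictions: probabilities in [0, 1] that each statement is true.
    outcomes: 1 if the statement turned out true, 0 if false.
    Returns the mean squared error (Brier score); 0.0 is perfect.
    """
    assert len(predictions) == len(outcomes)
    return sum((p - o) ** 2 for p, o in zip(predictions, outcomes)) / len(predictions)

# Hypothetical example: three conjectures, later revealed true, false, true.
preds = [0.9, 0.2, 0.6]
truths = [1, 0, 1]
print(round(brier_score(preds, truths), 3))  # → 0.07
```

Running this over a batch of theorems with known status would give each coauthor a rough measure of how well their felt confidence tracks reality.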