“Oh, look, Eliezer is overconfident because he believes in many-worlds.”
I can agree that this is absolutely nonsensical reasoning. The correct reason to believe Eliezer is overconfident is that he's a human being, and the prior probability that any given human is overconfident is extremely high.
One might propose heuristics to determine whether person X is more or less overconfident, but “X disagrees strongly with me personally on this controversial issue, therefore he is overconfident” (or stupid or ignorant) is the exact type of flawed reasoning that comes from self-serving biases.