but now I can bump my estimate back up. There is at least one belief which my tribe elevates to the rank of scientific fact, yet which I think is probably wrong: I do not believe in the Big Bang.
I don’t think we can reasonably elevate our estimate of our own rationality by observing that we disagree with the consensus of a respected community.
Second, there is the cosmic microwave background radiation, which is said to consist of leftover photons from the Big Bang. If the background radiation had been a prediction of Big Bang theory, then I might have been convinced by this experimental evidence, but in fact the background radiation was discovered by accident. Only afterwards did the proponents of Big Bang theory retrofit it as a prediction of their model.
I am wary of this kind of argument. I should not be able to discredit a theory by collecting all possible evidence and publishing it before the theory’s proponents have a chance to think things through.
“I don’t think we can reasonably elevate our estimate of our own rationality by observing that we disagree with the consensus of a respected community.”
But isn’t Eliezer suggesting, in this very post, that we should use uncommon justified beliefs as an indicator that people are actually thinking for themselves as opposed to copying the beliefs of the community? I would assume that the standards we use to judge others should also apply when judging ourselves.
On the other hand, what you’re saying sounds reasonable too. After all, crackpots also disagree with the consensus of a respected community.
The point is that there could be many reasons why a person disagrees with a respected community, one of which is that the person is actually being rational and the community is wrong. Or, as seems to be the case here, that the person is being rational but hasn’t yet encountered all the evidence which the community has. In any case, given that I’m here, following a website dedicated to the art of rationality, I think that in this case rationality is quite a likely cause of my disagreement.
“I should not be able to discredit a theory by collecting all possible evidence and publishing it before the theory’s proponents have a chance to think things through.”
I agree that if a piece of evidence is published before it is predicted, this is not evidence against the theory, but it does weaken the prediction considerably. Therefore, please don’t publish this entire collection of all possible evidence, as it will make it much harder afterwards to distinguish between theories!
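The Bayesian point behind this exchange can be made concrete with a toy calculation (my own sketch, not from the thread, with made-up probabilities): in odds form, a piece of evidence shifts belief by the likelihood ratio between theories. An advance prediction that rival theories cannot match gives a large ratio, while a retrofitted “prediction” that every rival can also accommodate gives a ratio near 1, so the evidence barely moves the odds.

```python
# Toy odds-form Bayes sketch (illustrative numbers only, not from the thread).

def posterior_odds(prior_odds, p_e_given_t, p_e_given_r):
    """Odds form of Bayes' theorem: posterior odds = prior odds * likelihood ratio."""
    return prior_odds * (p_e_given_t / p_e_given_r)

# Genuine advance prediction: theory T makes evidence E very likely,
# rival R makes it unlikely -- a large update in favor of T.
advance = posterior_odds(1.0, 0.9, 0.1)

# Retrofit: once E is already known, the rival can be tuned to accommodate
# it too, so the likelihood ratio is close to 1 and E barely moves the odds.
retrofit = posterior_odds(1.0, 0.9, 0.8)

print(advance, retrofit)
```

This is why accidental discovery followed by retrofitting is weaker support than a risky advance prediction: the discovery itself is the same, but the spread between what competing theories assign to it shrinks once everyone can fit it after the fact.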
“But isn’t Eliezer suggesting, in this very post, that we should use uncommon justified beliefs as an indicator that people are actually thinking for themselves as opposed to copying the beliefs of the community? I would assume that the standards we use to judge others should also apply when judging ourselves.
On the other hand, what you’re saying sounds reasonable too. After all, crackpots also disagree with the consensus of a respected community.”
Eliezer didn’t say that we should use “disagreeing with the consensus of a respected community” as an indicator of rationality. He said that we should use disagreeing with the consensus of one’s own community as an indicator of rationality.