When I disagree with a rational man, I let reality be our final arbiter; if I am right, he will learn; if I am wrong, I will; one of us will win, but both will profit.

— Ayn Rand

IME that’s the case in a sizeable fraction of disagreements between humans; but if they “let reality be [their] final arbiter”, they ought to realize that in the process.

Perhaps, but it is rather unlikely that they are equally wrong. It is far more likely that one will be less wrong than the other. Indeed, improving our knowledge by comparing such partially correct views would seem to be the whole point of Bayesian rationality.

I think that if the other person convinces you that they are right, and they are right, then it should count as “winning the argument”. It’s the idea that has lost, not you.
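The “one will be less wrong than the other” point can be made concrete with a toy model. Everything below (the Gaussian noise model, the chosen variances, the two hypothetical observers) is an illustrative assumption, not anything from the thread: it sketches how precision-weighted averaging, the Bayesian posterior mean for independent Gaussian estimates, is on average less wrong than either disagreeing estimate alone.

```python
import random

random.seed(0)
TRUTH = 10.0          # the quantity both observers are estimating
TRIALS = 10_000

def precision_weighted(estimates):
    # Inverse-variance weighting: the Bayesian posterior mean for
    # independent Gaussian estimates of the same quantity.
    return (sum(mean / var for mean, var in estimates)
            / sum(1.0 / var for _, var in estimates))

def mse(errors):
    # Mean squared error over a list of deviations from the truth.
    return sum(e * e for e in errors) / len(errors)

err_a, err_b, err_both = [], [], []
for _ in range(TRIALS):
    a = (random.gauss(TRUTH, 1.0), 1.0)   # "less wrong" observer: variance 1
    b = (random.gauss(TRUTH, 3.0), 9.0)   # "more wrong" observer: variance 9
    err_a.append(a[0] - TRUTH)
    err_b.append(b[0] - TRUTH)
    err_both.append(precision_weighted([a, b]) - TRUTH)

mse_a, mse_b, mse_both = mse(err_a), mse(err_b), mse(err_both)
# Theory: combined variance = 1 / (1/1 + 1/9) = 0.9, better than either alone.
```

Because the weights are inverse variances, the less reliable observer still contributes, just with less weight; both “profit” in the sense that the combined estimate beats each individual one on average.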
This makes the (flawed) assumption that, in a disagreement, the two parties cannot both be wrong.
Also, they could be wrong about whether they actually disagree.
I have also heard it quoted like this.