My Bayesian wannabe paper is an argument against disagreements that are based on computation differences. You can “resolve” a disagreement by moving your opinion in the direction of the other opinion. If failing to do this reduces your average accuracy, I feel I can call that failure “irrational”.
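One way to make the average-accuracy claim concrete, assuming squared-error scoring and averaging over the two parties (an illustrative reading, not necessarily the paper's own argument): for two estimates x_1 and x_2 of a quantity \theta,

\[ \left(\tfrac{x_1 + x_2}{2} - \theta\right)^2 \;\le\; \tfrac{1}{2}\left[(x_1 - \theta)^2 + (x_2 - \theta)^2\right], \]

with strict inequality whenever x_1 \neq x_2. So if both parties move to the midpoint, their average squared error cannot increase, whatever \theta turns out to be.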
It would be clearer if you said “epistemically irrational”. Instrumental rationality can be consistent with sticking to your guns—especially if your aim involves appearing to be exceptionally confident in your own views.
Do you have a suggestion for how much one should move one’s opinion in the direction of the other opinion, and an argument that doing so would improve average accuracy?
If you don’t have time for that, can you just explain what you mean by “average”? Average over what, using what distribution, and according to whose computation?
How confident are you? How confident do you think your opponent is? Use those estimates to derive the distance you move.
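A minimal sketch of one way to apply that rule, assuming each party reports a point estimate and a positive confidence weight; the function name, the linear weighting scheme, and the example numbers are illustrative, not taken from the exchange:

def updated_opinion(my_estimate, my_confidence, their_estimate, their_confidence):
    """Shift my estimate toward the other party's, weighted by relative confidence.

    Estimates are probabilities (or any point estimates); confidences are
    positive weights. This is one possible reading of "use those estimates
    to derive the distance you move", not a formula given in the exchange.
    """
    # Fraction of the gap to cross: 0 if I grant the other party no
    # credibility, 1 if I grant them all of it.
    step = their_confidence / (my_confidence + their_confidence)
    return my_estimate + step * (their_estimate - my_estimate)

# Example: I hold 0.9 with confidence weight 2; I judge my opponent to hold
# 0.3 with confidence weight 1, so I cross one third of the gap: 0.9 -> 0.7.
print(updated_opinion(0.9, 2.0, 0.3, 1.0))  # approximately 0.7

If the other party makes the symmetric move with the same weights, both land at the same confidence-weighted average (here 0.7), which is one way to read “resolving” the disagreement.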