But this is a contradiction, because two opposing views can’t each be simultaneously more likely to be right than the other.
As stated, this is not a contradiction. Look:
A is arguing against B
A should assume that p
B should assume that ~p
That A has good reason to assume that p and B has good reason to assume that ~p is not contradictory; if they’re both rational, it just entails that they have access to different information.
And the fact that I believe that p gives me a reason to believe that p, all else being equal; after all, I can’t be expected to re-evaluate all of my beliefs at all times, and must assume that my reasons for starting to believe that p were good ones to begin with. (That doesn’t mean this isn’t a good time to re-evaluate my reasons for believing that p, if they’re accessible to me.)
But the truth can’t be dependent on which person you are, right? If we say you are Bob, or we say you are Carol, that doesn’t change whether there is life on Mars. Therefore, what one should assume about the truth does not change merely based on which person one is; that was my reasoning. Put more technically, truth (about factual matters where people might disagree) is not “indexical”.
No, the truth isn’t dependent upon which person I am, but what I should believe isn’t directly dependent upon the truth (if it was, then there wouldn’t be any disagreement in the first place). Rather, what I believe is dependent upon what constitutes a good reason for me to believe something, and that is indexical.
But how could you know they should believe P about X while you should believe Q? You can’t think you are both doing what you should if you believe different things, right?
If I were one of the disputants, then I would not know that the other person should believe P. Similarly, as an outside observer I would know only that at least one of them is certainly incorrect (assuming your P and Q are inconsistent).
You’re changing the context without warrant.
If I’m in the situation then I’ll do what I should. I must then assume that the other person either has some reason to believe that Q, or that they’re being irrational, or some other handy explanation. By the principle of charity I should then perhaps assume they have a good reason to believe that Q and so I should re-evaluate my reasons for believing that P.
That doesn’t change the fact that it’s not contradictory for me to prefer beliefs I already hold over beliefs I don’t, and to expect other people to follow the same rule of thumb. What alternative could there even be, in the absence of a reason to re-evaluate your beliefs?
One other problem I’ll note with this heuristic (“prefer your own side in a disagreement”) is that, more or less by definition, it’s going to be wrong at least 50% of the time (more than 50% in cases where you’re both wrong). That’s not a really great heuristic.
You maybe.

But seriously, it’s not just a heuristic for disagreements. “One should prefer the beliefs one already has, all else equal” is a pretty good heuristic (a lot better than “one should have random beliefs” or “one should adopt beliefs one does not have, all else equal”).
My point was simply an answer to the question, “Now let us add one piece of information: you are one of the two people. Does this give you grounds to assume that your view is the one which is right?” If I’ve established that one does have such a reason in general about one’s beliefs, then the answer is clearly “yes”.
I’d agree that “in general, you should believe yourself” is a simpler rule than “in general, you should believe yourself, except when you come across someone else who has a different belief”. And simplicity is a plus. There are good reasons to prefer simple rules.
The question is whether this simplicity outweighs the theoretical arguments that greater accuracy can be attained by using the more complex rule. Perhaps someone who sufficiently values simplicity can reasonably argue for adopting the first rule.
ETA: Maybe I am wrong about the first rule: it should be “in general, you should believe yourself, except when you come across evidence that you are wrong”. And then the question is how strong the evidence of meeting someone who came up with a different view actually is. But this brings us back to the symmetry argument, which says that this is actually much stronger evidence than most people imagine.
I think we may have exhausted any disagreement we actually had.
As I noted early on, I agree that coming across someone else with a different belief is a good occasion for re-evaluating one’s beliefs. From here, it will be hard to pin down a true substantive difference.