So that leaves us with your objection to the view where we do try to maintain decisions, and find agent-dependent beliefs problematic. I’m not sure I understand your objection there, however. At least to some extent you seem to be pointing at external circumstances which might affect our decision, but my original comment already noted that external circumstances also play a role, rather than the agent’s decision being the sole determinant.
I… don’t know that I can explain my point any better than I already have.
Perhaps I should note that there’s a sense in which “beliefs determine our actions” which I find to be true but uninteresting (at least in this context). This is the utterly banal sense of “if I believe that it is raining outside, then I will bring an umbrella when I go for a walk”—i.e., the sense in which all of our actions are, in one way or another, determined by our beliefs.
Of course, there is nothing epistemologically challenging about this; it is just the ordinary, default state of affairs. You said:
An important framing here is “your beliefs determine your actions, so how do you get the beliefs which cause the best actions”.
If the result of thinking like this is that you decide to adopt false beliefs in order for “better actions” to (allegedly) result than if you had only true beliefs, then this is foolishness; but there is no epistemological challenge here—no difficulty for the project of epistemic rationality. Beyond that, nshepperd’s comments elsethread have dealt with this aspect of the matter, and I have little to add.
The (alleged) difficulty lies with beliefs which not just (allegedly) determine our decisions, but whose truth value is, in turn, determined by our decisions. (For example, “I will go to the beach this evening”.)
But I have shown how there is not, in fact, any difficulty with those beliefs after all.
You said:
A partial response to [nshepperd’s] criticisms would be that, even if you do have truth as the criterion for your beliefs, then this still leaves the truth value of a wide range of beliefs underdetermined;
But I have shown how this is not the case.
Thus your response to nshepperd’s criticisms, it seems, turns out to be invalid.
but there is no epistemological challenge here—no difficulty for the project of epistemic rationality. Beyond that, nshepperd’s comments elsethread have dealt with this aspect of the matter, and I have little to add.
The (alleged) difficulty lies with beliefs which not just (allegedly) determine our decisions, but whose truth value is, in turn, determined by our decisions. (For example, “I will go to the beach this evening”.)
But I have shown how there is not, in fact, any difficulty with those beliefs after all.
Hmm. You repeatedly use the word “difficulty”. Are you interpreting me to be saying that this would pose some kind of an insurmountable challenge for standard epistemic rationality? I was trying to say the opposite; that unlike what nshepperd was suggesting, this is perfectly in line and compatible with standard epistemic rationality.
I am referring to this bit:

A partial response to your criticisms would be that, even if you do have truth as the criterion for your beliefs, then this still leaves the truth value of a wide range of beliefs underdetermined; so one postrational approach would be to use truth as the primary criterion for forming beliefs, and then use other criteria for filling in the beliefs which the criterion of truth doesn’t help us distinguish between.
And I am saying that this position is wrong. I am saying that there is no special underdetermination. I am saying that there is no problem with using truth as the only criterion for beliefs. I am saying, therefore, that there is no reason to use any other criteria for belief, and that “postrationality” as you describe it is unmotivated.
Okay. I should probably give a more concrete example of what I mean.
First, here’s a totally ordinary situation that has nothing to do with postrationality: me deciding whether I want to have a Pepsi or a Coke. From my point of view as the decision-maker, there’s no epistemically correct or incorrect action; it’s up to my preferences to choose one or the other. From the point of view of instrumental rationality, there is of course a correct action: whichever drink best satisfies my preferences at that moment. But epistemic rationality does not tell me which one I should choose; that’s in the domain of instrumental rationality.
My claim is that there are analogous situations where the decision you are making is “what should I believe”, where epistemic rationality does not offer an opinion one way or the other; the only criterion comes from instrumental rationality, where it would be irrational not to choose the one that best fulfills your preferences.
As an example of such a situation, say that you are about to give a talk to some audience. Let’s assume that you are basically well prepared, facing a non-hostile audience, etc., so there is no external reason why this talk would need to go badly. The one thing that matters most is how confident you are in giving the talk, which in turn depends on how you believe it will go:
If you believe that this talk will go badly, then you will be nervous and stammer, and this talk will go badly.
If you believe that this talk will go well, then you will be confident and focused on your message, and the talk will go well.
Suppose for the sake of example that you could just choose which belief you have, and also that you know what effects this will have. In that case, even though you are choosing which belief to have, from the point of view of epistemic rationality, they are both equally valid. If you choose to believe that the talk will go badly, then it will go badly, so the belief is epistemically valid; if you choose to believe that it will go well, then it will go well, so that belief is epistemically valid as well. The only criterion you get is the one from instrumental rationality: do you prefer your talk to go well, or do you prefer it to go badly?
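To make the stylized setup concrete, here is a minimal sketch (assuming, as stipulated above, that the belief alone determines how the talk goes) showing that either belief, once adopted, comes out true:

```python
# Stylized talk example: assume, as stipulated above, that the belief alone
# determines how the talk goes.

def outcome(belief: str) -> str:
    """Return how the talk goes, given what the speaker believes."""
    if belief == "the talk will go well":
        return "the talk goes well"    # confidence -> good talk
    return "the talk goes badly"       # nervousness -> bad talk

for belief in ("the talk will go well", "the talk will go badly"):
    result = outcome(belief)
    # Each belief is borne out by the very outcome it produces, so under these
    # assumptions neither belief is less accurate than the other.
    print(f"Believe {belief!r} -> {result}")
```

Under these (deliberately extreme) assumptions, accuracy alone does not pick between the two beliefs; only the preference for the talk going well does.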
From my point of view as the decision-maker, there’s no epistemically correct or incorrect action
I’m not sure I know what an “epistemically correct action” or an “epistemically incorrect action” is. Actions aren’t the kinds of things that can be “epistemically correct” or “epistemically incorrect”. This would seem to be a type error.
But epistemic rationality does not tell me which one I should choose; that’s in the domain of instrumental rationality.
Indeed…
My claim is that there are analogous situations where the decision you are making is “what should I believe”, where epistemic rationality does not offer an opinion one way or the other;
The epistemically correct belief is “the belief which is true” (or, of course, given uncertainty: “the belief which is most accurate, given available information”). This is always the case.
The one thing that most matters is how confident you are in giving the talk, which in turn depends on how you believe it will go
The correct belief, obviously, is:
“How the talk will go depends on how confident I am. If I am confident, then the talk will go well. If I am not confident, it will go badly.”
(Well, actually more like: “If I am confident, then the talk is more likely than not to go well. If I am not confident, it is more likely than not to go badly.”)
Conditionalizing, I can then plug in my estimate of the probability that I will be confident.
If I am able to affect this probability—such as by deciding to be confident (if I have this ability), or by taking some other action (such as taking anxiolytic medication, imagining the audience naked, doing some exercise beforehand, etc.)—then, of course, I will do that.
I will then—if I feel like doing so—revise my probability estimate of my confidence, and, correspondingly, my probability estimate of the talk going well. Of course, this is not actually necessary, since it does not affect anything one way or the other.
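To put illustrative numbers on this (the specific probabilities below are made up for the example; only the structure of the calculation matters), a minimal sketch:

```python
# Illustrative, made-up numbers; the point is the structure of the calculation.

p_well_given_confident = 0.9   # P(talk goes well | I am confident)
p_well_given_nervous   = 0.3   # P(talk goes well | I am not confident)

def p_talk_goes_well(p_confident: float) -> float:
    """Law of total probability, conditioning on whether I end up confident."""
    return (p_well_given_confident * p_confident
            + p_well_given_nervous * (1.0 - p_confident))

# Initial estimate of the probability that I will be confident.
p_confident = 0.5
print(f"P(talk goes well), before intervening: {p_talk_goes_well(p_confident):.2f}")  # 0.60

# Suppose some action (deciding to be confident, medication, exercise, etc.)
# raises my estimate of that probability; I then revise the overall estimate too.
p_confident = 0.9
print(f"P(talk goes well), after intervening:  {p_talk_goes_well(p_confident):.2f}")  # 0.84
```

At every step the stated probability is just the most accurate estimate given the current information; nothing here requires adopting a belief for any reason other than accuracy.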
Suppose for the sake of example that you could just choose which belief you have
As always, I would choose to have the most accurate belief, of course, as described above.
In that case, even though you are choosing which belief to have, from the point of view of epistemic rationality, they are both equally valid.
No, choosing to have any but the most accurate belief is epistemically incorrect.
The only criterion you get is the one from instrumental rationality: do you prefer your talk to go well, or do you prefer it to go badly?
Indeed not; our criterion is, as always, the one from epistemic rationality, i.e. “have the most accurate beliefs”.