From my point of view as the decision-maker, there’s no epistemically correct or incorrect action
I’m not sure I know what an “epistemically correct action” or an “epistemically incorrect action” is. Actions aren’t the kinds of things that can be “epistemically correct” or “epistemically incorrect”. This would seem to be a type error.
But epistemic rationality does not tell me which one I should choose; that’s in the domain of instrumental rationality.
Indeed…
My claim is that there are analogous situations where the decision you are making is “what should I believe”, where epistemic rationality does not offer an opinion one way or the other;
The epistemically correct belief is “the belief which is true” (or, of course, given uncertainty: “the belief which is most accurate, given available information”). This is always the case.
The one thing that most matters is how confident you are in giving the talk, which in turn depends on how you believe it will go
The correct belief, obviously, is:
“How the talk will go depends on how confident I am. If I am confident, then the talk will go well. If I am not confident, it will go badly.”
(Well, actually more like: “If I am confident, then the talk is more likely than not to go well. If I am not confident, it is more likely than not to go badly.”)
Conditionalizing, I can then plug in my estimate of the probability that I will be confident.
If I am able to affect this probability—such as by deciding to be confident (if I have this ability), or by taking some other action (such as taking anxiolytic medication, imagining the audience naked, doing some exercise beforehand, etc.)—then, of course, I will do that.
I will then—if I feel like doing so—revise my probability estimate of my confidence, and, correspondingly, my probability estimate of the talk going well. Of course, this is not actually necessary, since it does not affect anything one way or the other.
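The conditionalizing-and-revising steps above can be sketched numerically (all the probabilities here are made-up numbers, purely for illustration):

```python
# Hypothetical conditional probabilities for the talk scenario.
p_well_given_confident = 0.8   # P(talk goes well | I am confident)
p_well_given_not       = 0.3   # P(talk goes well | I am not confident)

# My current estimate of the probability that I will be confident.
p_confident = 0.5

# Conditionalizing: law of total probability.
p_well = (p_well_given_confident * p_confident
          + p_well_given_not * (1 - p_confident))
# p_well is approximately 0.55

# Suppose I then take some action (medication, exercise, etc.) that
# raises my estimate of P(confident). I revise that estimate and,
# correspondingly, my estimate of the talk going well:
p_confident_revised = 0.9
p_well_revised = (p_well_given_confident * p_confident_revised
                  + p_well_given_not * (1 - p_confident_revised))
# p_well_revised is approximately 0.75
```

Nothing here requires choosing an inaccurate belief at any point: the belief is the conditional structure plus the current probability estimates, and taking an action only changes which estimates are accurate.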
Suppose for the sake of example that you could just choose which belief you have
As always, I would choose to have the most accurate belief, of course, as described above.
In that case, even though you are choosing which belief to have, from the point of view of epistemic rationality, they are both equally valid.
No, choosing to have any but the most accurate belief is epistemically incorrect.
The only criteria you get is the one from instrumental rationality: do you prefer your talk to go well, or do you prefer it to go badly?
Indeed not; our criterion is, as always, the one we get from epistemic rationality, i.e. “have the most accurate beliefs”.