I’m still deeply troubled by the focus on the labels “rational” and now “Bayesian”, rather than “winning”, “predicting”, or “correct”.
For epistemic rationality, focus on truth rather than rationality: do these beliefs map to actual contingent states of the universe? Especially for human-granularity beliefs, Bayesian reasoning is really difficult, because it’s unlikely for you to know your priors in any precise way.
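To make the prior-sensitivity point concrete, here is a minimal sketch (my own illustration, not from the original discussion; the likelihood numbers are invented purely for demonstration). Applying Bayes’ theorem to a hypothesis like “Adam is innocent”, the same piece of evidence yields very different posteriors depending on which prior you start from:

```python
# Illustrative only: made-up likelihoods, showing how the posterior
# depends on an imprecisely known prior.

def posterior(prior_innocent, p_evidence_given_innocent, p_evidence_given_guilty):
    """Bayes' theorem: P(innocent | evidence)."""
    p_evidence = (prior_innocent * p_evidence_given_innocent
                  + (1 - prior_innocent) * p_evidence_given_guilty)
    return prior_innocent * p_evidence_given_innocent / p_evidence

# Same (made-up) evidence likelihoods, three different priors:
for prior in (0.5, 0.9, 0.99):
    print(prior, round(posterior(prior, 0.2, 0.8), 3))
# 0.5  -> 0.2
# 0.9  -> 0.692
# 0.99 -> 0.961
```

If you can’t pin down the prior any more precisely than “somewhere between 50% and 99%”, the posterior can land anywhere from 0.2 to 0.96, which is the sense in which precise Bayesian reasoning is hard for human-granularity beliefs.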
For instrumental rationality, focus on decisions: are the actions I’m taking based on these beliefs likely to improve my future experiences?
I don’t understand what that means, even after a Google search, so please enlighten me.
For epistemic rationality
I think so. I think she has exhausted all the possible avenues to reach the truth. So she is epistemically rational. Do you agree?
For instrumental rationality
Now this is confusing to me as well. Let us forget about the extension for the moment and focus solely on the narrative as presented in the OP. I am not familiar with how value and rationality go together, but I think there is nothing wrong if her value is “Adam’s innocence” and it is inherently valuable, an end in itself. Am I making any mistake in my train of thought?
By human-granularity, I mean beliefs about macro states that can be analyzed and manipulated by human thought and expressed in reasonable amounts (say, less than a few hundred pages of text) of human language. As contrasted with pure analytic beliefs about the state of the universe expressed numerically.
For instrumental rationality, what goals are furthered by her knowing the truth of this fact? Presuming that if Adam is innocent, she wants to believe that Adam is innocent and if Adam is guilty, she wants to believe Adam is guilty, why does she want to be correct (beyond “I like being right”)? What decision will she make based on it?
why does she want to be correct (beyond “I like being right”)?
I think that’s it. “I like knowing that the person I love is innocent.” Which implies that Adam is not lying to her, and “I like being in a healthy, fulfilling, and genuine marital relationship.”
That’s a reason to want him to be innocent, not a reason to want to know the truth. What’s her motivation for the necessary second part of the litany: “if Adam is guilty, I want to believe that Adam is guilty”?
“If Adam is guilty, then the relationship was not genuine.” Am I on the right track, or did I misunderstand your question?

That just moves it up a level. If she is rational, she’ll say “if our relationship was genuine, I want to believe it was genuine. If our relationship was not genuine, I want to believe it was not genuine”.
The OP and most of the discussion have missed the fundamental premise of rationality: truth-seeking. The question is not “is Eve rational”, but “is Eve’s belief (including acknowledgement of uncertainty) correct?”