What if we take one step back and suppose Adam didn’t die? Eve claims that her belief pays rent because it could be falsified if Adam’s character changed. In this scenario, I suppose you would agree that Eve is still rational.
Now, I cannot formulate my arguments properly at the moment, but I think it is strange that Adam’s death makes Eve’s belief irrational, as per http://lesswrong.com/lw/ss/no_logical_positivist_i/:
So I do not believe a spaceship blips out of existence when it crosses the cosmological horizon of our expanding universe, even though the spaceship’s existence has no further experimental consequences for me.
I think you’re focusing too much on the label “rational”, and not enough on the actual effect of beliefs.
I’ll admit I’m closer to logical positivism than Eliezer is, but even if you make the argument (which you haven’t) that the model of the universe is simpler (in the Kolmogorov complexity sense) if you believe Adam killed Abel, it’s still not important. Unless you’re making predictions and taking actions based on a belief (or on beliefs influenced by that belief), it’s neither rational nor irrational; it’s irrelevant.
Now, consider a somewhat more complicated example, where Eve has to judge Cain’s likelihood of murdering her and thinks the circumstances of the locked room in the past are relevant to her future; there are definite predictions she should be making. Her confidence in Adam’s innocence implies Cain’s guilt, and she should be concerned.
It’s still the case that she cannot possibly have enough evidence for her confidence to be 1.00.
Thank you, that was a very nice extension to the story. I should have included such a scenario to make her belief relevant. I agree with you: assigning 100% probability is irrational in her case. But if she is not rationally literate enough to express herself in a fuzzy, non-binary way, I think she would maintain rationality by saying “Ceteris paribus, I prefer not to be locked in the same room with Cain, because I believe he is a murderer, because I believe Adam was innocent” (ignoring the ad hominem).
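(An aside on why a confidence of exactly 1.00 is unreachable: here is a minimal sketch in odds form, using my own generic symbols H for the hypothesis and E for the evidence, not anything taken from the discussion above. Bayes’ rule in odds form is

$$\frac{P(H \mid E)}{P(\neg H \mid E)} = \frac{P(H)}{P(\neg H)} \cdot \frac{P(E \mid H)}{P(E \mid \neg H)},$$

and any real piece of evidence has $P(E \mid \neg H) > 0$, so each observation multiplies the odds by a finite factor. A posterior of exactly 1 would require infinite odds, which no finite amount of evidence can produce unless the prior was already 1.)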
I was under the impression that the gold standard for rationality is falsifiability. However, I now understand that Eve is rational despite unfalsifiability, because she remains Bayesian.
I’m still deeply troubled by the focus on labels “rational” and now “Bayesian”, rather than “winning”, “predicting”, or “correct”.
For epistemic rationality, focus on truth rather than rationality: do these beliefs map to actual contingent states of the universe? Especially for human-granularity beliefs, Bayesian reasoning is really difficult, because it’s unlikely for you to know your priors in any precise way.
For instrumental rationality, focus on decisions: are the actions I’m taking based on these beliefs likely to improve my future experiences?
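(To illustrate why not knowing your priors precisely matters, here is a small sketch with made-up numbers of my own, not anything from the discussion: the same piece of evidence produces very different posteriors depending on the prior you start from.

```python
# Sketch: sensitivity of a Bayesian posterior to an imprecisely known prior.
# The likelihood ratio of 10 and the candidate priors are illustrative only.

def posterior(prior: float, likelihood_ratio: float) -> float:
    """Posterior P(H|E) from prior P(H) and likelihood ratio P(E|H)/P(E|~H)."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

for prior in (0.01, 0.10, 0.50, 0.90):
    print(f"prior={prior:.2f} -> posterior={posterior(prior, 10):.3f}")
# prior=0.01 -> posterior=0.092
# prior=0.10 -> posterior=0.526
# prior=0.50 -> posterior=0.909
# prior=0.90 -> posterior=0.989
```

If you can only say your prior is “somewhere between 1% and 90%”, the same evidence leaves you anywhere from about 9% to 99% confident, which is part of what makes explicit Bayesian reasoning hard for human-granularity beliefs.)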
I don’t understand what that means, even after a Google search, so please enlighten me.
For epistemic rationality
I think so. I think she has exhausted all the possible avenues to reach the truth. So she is epistemically rational. Do you agree?
For instrumental rationality
Now this is confusing to me as well. Let us forget about the extension for the moment and focus solely on the narrative as presented in the OP. I am not familiar with how value and rationality go together, but I think there is nothing wrong if her value is “Adam’s innocence” and it is inherently valuable, an end in itself. Am I making any mistake in my train of thought?
By human-granularity, I mean beliefs about macro states that can be analyzed and manipulated by human thought and expressed in reasonable amounts (say, less than a few hundred pages of text) of human language. As contrasted with pure analytic beliefs about the state of the universe expressed numerically.
For instrumental rationality, what goals are furthered by her knowing the truth of this fact? Presuming that if Adam is innocent, she wants to believe that Adam is innocent and if Adam is guilty, she wants to believe Adam is guilty, why does she want to be correct (beyond “I like being right”)? What decision will she make based on it?
why does she want to be correct (beyond “I like being right”)?
I think that’s it. “I like knowing that the person I love is innocent,” which implies that Adam is not lying to her, and “I like being in a healthy, fulfilling, and genuine marital relationship.”
That’s a reason to want him to be innocent, not a reason to want to know the truth. What’s her motivation for the necessary second part of the litany: “if Adam is guilty, I want to believe that Adam is guilty”?
“If Adam is guilty, then the relationship was not genuine.” Am I on the right track, or did I misunderstand your question?
That just moves it up a level. If she is rational, she’ll say “if our relationship was genuine, I want to believe it was genuine. If our relationship was not genuine, I want to believe it was not genuine”.
The OP and most of the discussion have missed the fundamental premise of rationality: truth-seeking. The question is not “is Eve rational”, but “is Eve’s belief (including acknowledgement of uncertainty) correct”?