Well, sure. Repeating other posts—but one of the most common examples is when an agent’s beliefs are displayed to other agents. Imagine that all your associates think that there is a Christian god. This group includes all your prospective friends and mates. Do you tell them you are an agnostic/atheist—and that their views are not supported by the evidence? No, of course not! However, you had better not lie to them either—since most humans lie so poorly. The best thing to do is probably to believe their nonsense yourself.
Tim, that’s an excellent argument for why rationality isn’t always the winning strategy in real life. People have been saying this sort of thing all week, but it was your “most humans lie so poorly” comment that really made it click for me, especially in the context of evolutionary psychology.
I’d really like to hear one of the “rationalists should always win” people address this objection.
We’re talking about at least two different notions of the word “rational”:
Robin Hanson used the definition at the top of this post, regarding believing the truth. There are social/evolutionary costs to that, partly because humans lie poorly.
The causal decision theorists’ definition that Eliezer Yudkowsky was annoyed by. CDT defines rationality as a specific method of deciding which action to take, even though that method leads to two-boxing on (and losing) Newcomb’s problem. Yudkowsky’s objection, summarized by the slogan “Rationalists should WIN,” was NOT a definition. It is a quality of his informal concept of rationality which the CDT definition failed to capture.
The claim “rationalists should always win” comes from taking Yudkowsky’s slogan as a definition of rationality. If that is the definition that you are using, then the claim is tautological.
Please note that I don’t endorse this misreading of Yudkowsky’s post, I’m just trying to answer your question.
Thanks, John.
As you say, defining rationality as winning and then saying rationalists always win is a tautology. But aside from your two definitions, there’s a third definition: the common definition of rationality as basing decisions on evidence, Bayes, and logic. So as I see it, supporters of “rationalists always win” need to do one of the following:
Show that the winning definition is the same as the Bayes/logic/evidence definition. Tim’s counterexample of the religious believer who’s a poor liar makes me doubt this is possible.
Stop using “rationality” to refer to things like the Twelve Virtues and Bayesian techniques, since these virtues and techniques sometimes lose and are therefore not always rational.
Abandon “rationalists always win” in favor of Robin’s “rationalists always seek the truth”. I think that definition is sufficient to demonstrate that a rationalist should one-box on Newcomb’s problem anyway. After all, if it’s true that one-boxing gets the better result, a seeker of truth should realize that and decide to one-box (a quick expected-value sketch follows below).
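To make the payoff comparison concrete, here is a minimal sketch. It assumes the standard Newcomb payoffs ($1,000,000 in the opaque box, $1,000 in the transparent one) and an illustrative predictor accuracy of 99%, a number chosen purely for the example rather than taken from the thread. It compares the expected winnings of a habitual one-boxer against a habitual two-boxer:

```python
# Minimal sketch: expected winnings in Newcomb's problem under an assumed
# predictor accuracy (0.99 is illustrative, not canonical).

def expected_payoff(one_box: bool, predictor_accuracy: float = 0.99) -> float:
    big, small = 1_000_000, 1_000  # standard Newcomb payoffs
    if one_box:
        # The opaque box contains the $1M only if the predictor foresaw one-boxing.
        return predictor_accuracy * big
    # A two-boxer always gets the $1,000; the opaque box is full only
    # when the predictor mistakenly expected one-boxing.
    return small + (1 - predictor_accuracy) * big

print(f"one-box: ${expected_payoff(True):,.0f}")   # ~$990,000
print(f"two-box: ${expected_payoff(False):,.0f}")  # ~$11,000
```

On these numbers the one-boxer comes out far ahead, which is the sense in which two-boxing “loses” even though CDT endorses it.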
There are no supporters of “rationalists always win” – the slogan is “rationalists should win”. Long-term / on-average, it’s rational to expect a high correlation between rationality and success.
[1] – I’d bet that the rationalist strategy fares well against other heuristics; let’s devise a good test. There may always be an effective upper bound on the returns to increasing rationality in any community, but reality is dangerous – I’d expect rationalists to fare better.
[2] – Winning or losing one ‘round’ isn’t sufficient grounds for declaring a strategy, or a particular decision, non-rational. Buying lottery tickets isn’t rational just because some people win. And sometimes, winning isn’t possible.
[3] – I like “rationalists always seek the truth” but would add “… but they don’t seek all truths.”
You realize, of course, that under this policy everyone stays Christian forever.
Indeed—religion is persistent. Of course, in the real world you would find that isolated communities arise, where “belief mutations” can take hold without being severely punished by the crowd.
Interesting: if rationality corresponds to winning, and Christianity is persistent, then we should give up on trying to eliminate Christianity. Not merely because it is a waste of resources, but also because belief in God is not directly tied to winning and losing. Some beliefs lead to winning (philanthropy, community) and some beliefs lead to losing (insert any one of many here). We should focus our energies on discouraging the losing beliefs with whatever means are at our disposal, including humoring the belief in God within specific arguments. (For example, we could try to convince a Bible literalist that God would forgive them for believing in evolution because he deliberately gave us convincing evidence of it.) Learning as I go: I just learned such arguments are called “Pragmatism”.
I will likely delete this post now that it has been down-voted. I wrote it as a natural response to the information I read and am not attached to it. Before deleting, I’m curious whether I can solicit feedback from the person who down-voted me. Was it because the post was boring?