I always made a distinction between rationality and truth-seeking. Rationality is only intelligible in the context of a goal (whether that goal be rational or irrational). Now, someone who acts rationally will, given their information set, choose the best plan of action for achieving their goal. Part of being rational is knowing which goals will maximize one’s utility function.
My definition of truth-seeking is basically Robin’s definition of “rational.” I find it hard to imagine a time where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?
Well, sure. Repeating other posts—but one of the most common examples is when an agent’s beliefs are displayed to other agents. Imagine that all your associates think that there is a Christian god. This group includes all your prospective friends and mates. Do you tell them you are an agnostic/atheist—and that their views are not supported by the evidence? No, of course not! However, you had better not lie to them either—since most humans lie so poorly. The best thing to do is probably to believe their nonsense yourself.
Tim, that’s an excellent argument for why rationality isn’t always the winning strategy in real life. People have been saying this sort of thing all week, but it was your “most humans lie so poorly” comment that really made it click for me, especially in the context of evolutionary psychology.
I’d really like to hear one of the “rationalists should always win” people address this objection.
We’re talking about at least two different notions of the word “rational”:
Robin Hanson used the definition at the top of this post, regarding believing the truth. There are social/evolutionary costs to that, partly because humans lie poorly.
The causal decision theorists’ definition that Eliezer Yudkowsky was annoyed by. CDT defines rationality as a specific method of deciding what action to take, even though this method leads to two-boxing (and losing) Newcomb’s problem. Yudkowsky’s objection, summarized by the slogan “Rationalists should WIN,” was NOT a definition. It is a quality of his informal concept of rationality which the CDT definition failed to capture.
The claim “rationalists should always win” comes from taking Yudkowsky’s slogan as a definition of rationality. If that is the definition that you are using, then the claim is tautological.
Please note that I don’t endorse this misreading of Yudkowsky’s post; I’m just trying to answer your question.
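To make the Newcomb point concrete, here is a minimal expected-payoff sketch. The dollar amounts are the standard illustrative ones and the predictor’s accuracy is an assumed parameter; none of these numbers is specified anywhere in the thread.

```python
# A toy sketch of the usual Newcomb payoffs, under these assumptions: box A
# always holds $1,000; box B holds $1,000,000 iff the predictor foresaw
# one-boxing; the predictor is right 99% of the time.

def expected_payoff(one_box: bool, predictor_accuracy: float = 0.99) -> float:
    """Expected dollars, averaging over whether the predictor guessed right."""
    p = predictor_accuracy
    if one_box:
        # With probability p the predictor foresaw one-boxing, so box B is full.
        return p * 1_000_000 + (1 - p) * 0
    # With probability p the predictor foresaw two-boxing, so box B is empty;
    # box A's $1,000 is collected either way.
    return p * 1_000 + (1 - p) * (1_000_000 + 1_000)

print(expected_payoff(one_box=True))   # about 990,000
print(expected_payoff(one_box=False))  # about 11,000 -- two-boxing "loses" on average
```

CDT’s reply, of course, is that the box contents are already fixed at decision time, so averaging over the predictor like this is exactly what is in dispute.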
Thanks, John.
As you say, defining rationality as winning and then saying rationalists always win is a tautology. But aside from your two definitions, there’s a third definition: the common definition of rationality as basing decisions on evidence, Bayes, and logic. So as I see it, supporters of “rationalists always win” need to do one of the following:
Show that the winning definition is the same as the Bayes/logic/evidence definition. Tim’s counterexample of the religious believer who’s a poor liar makes me doubt this is possible.
Stop using “rationality” to refer to things like the Twelve Virtues and Bayesian techniques, since these virtues and techniques sometimes lose and are therefore not always rational.
Abandon “rationalists always win” in favor of Robin’s “rationalists always seek the truth”. I think that definition is sufficient to demonstrate that a rationalist should one-box on Newcomb’s problem anyway. After all, if it’s true that one-boxing is the better result, a seeker of truth should realize that and decide to one-box.
There are no supporters of “rationalists always win” – the slogan is “rationalists should win”. Long-term / on-average, it’s rational to expect a high correlation between rationality and success.
[1] – I’d bet that the rationalist strategy fares well against other heuristics; let’s devise a good test. There may always be an effective upper bound on the returns to increasing rationality in any community, but reality is dangerous – I’d expect rationalists to fare better.
[2] – Winning or losing one ‘round’ isn’t sufficient grounds to declare a strategy, or particular decisions, non-rational. Buying lottery tickets isn’t made rational by the fact that some people win (see the expected-value sketch below). And sometimes, winning isn’t possible.
[3] – I like “rationalists always seek the truth” but would add “… but they don’t seek all truths.”
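Here is the quick expected-value sketch promised in [2]. The jackpot, odds, and ticket price are made-up figures chosen for illustration, not any real lottery.

```python
# Toy expected-value check: occasional winners don't make the ticket a good bet.

ticket_price = 2.00
jackpot = 10_000_000
win_probability = 1 / 300_000_000  # assumed odds, purely illustrative

expected_value = win_probability * jackpot - ticket_price
print(f"Expected value per ticket: ${expected_value:.2f}")  # about -$1.97
```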
You realize, of course, that under this policy everyone stays Christian forever.
Indeed—religion is persistent. Of course, in the real world you would find that isolated communities form, where “belief mutations” could arise without being severely punished by the crowd.
Interesting: if rationality corresponds to winning, and Christianity is persistent, then we should give up on trying to eliminate Christianity. Not merely because it is a waste of resources, but also because belief in God is not directly tied to winning and losing. Some beliefs lead to winning (philanthropy, community) and some beliefs lead to losing (insert any one of many here). We should focus our energies on discouraging the losing beliefs with whatever means are at our disposal, including humoring the belief in God in specific arguments. (For example, we could try to convince a Bible literalist that God would forgive them for believing in evolution because he deliberately gave us convincing evidence of it.) Learning as I go: I just learned such arguments are called “Pragmatism”.
I will likely delete this post now that it has been down-voted. I wrote it as a natural response to the information I read, and I’m not attached to it. Before deleting, I’m curious whether I can solicit feedback from the person who down-voted me. Was it because the post was boring?
Pwno said: I find it hard to imagine a time where truth-seeking is incompatible with acting rationally (the way I defined it). Can anyone think of an example?
The classic example would invoke the placebo effect. Believing that medical care is likely to be successful can actually make it more successful; believing that it is likely to fail might vitiate the placebo effect. So, if you are taking a treatment with the goal of getting better, and that treatment is not very good (but it is the best available option), then it is better, from a rationalist goal-seeking perspective, to have an incorrectly high assessment of the treatment’s probability of success.
This generalizes more broadly to other areas of life where confidence is key. When dating, or going to a job interview, confidence can sometimes make the difference between success and failure. So it can pay, in such scenarios, to be wrong (so long as you are wrong in the right way).
It turns out that we are, in fact, generally optimized to make precisely this mistake. Far more people think they are above average in most domains than hold the opposite view. Likewise, people regularly place a high degree of trust in treatments with a very low probability of success, and we have many social mechanisms that try to encourage such behavior. It might be “irrational,” under your usage, to try to help these people form more accurate beliefs.
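A toy model of the placebo point above, purely for illustration: suppose the chance that the treatment works rises with how strongly the patient believes in it. The base rate and the belief bonus below are assumed numbers, not data.

```python
# Toy model: recovery probability rises with the patient's belief in the
# treatment. Base rate and bonus are assumptions chosen for illustration.

def success_probability(belief: float, base: float = 0.30, bonus: float = 0.20) -> float:
    """Chance of recovery as a function of belief in the treatment (0 to 1)."""
    return base + bonus * belief

calibrated_belief = 0.30   # the truth-seeker's accurate assessment
inflated_belief = 1.00     # the overconfident patient's assessment

print(success_probability(calibrated_belief))  # about 0.36
print(success_probability(inflated_belief))    # 0.5 -- being wrong pays off here
```

Under these assumptions, the belief that maximizes the expected outcome (1.0) sits far above the calibrated one (0.3), which is exactly the tension with truth-seeking.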
I like to distinguish information-theoretic rationality from decision-theoretic rationality. (But these are rather long terms.) Often on this blog it’s unclear which is meant (although you and Robin did make it clear.)
The relevant articles: “What do we mean by rationality” and the rationality wiki entry.
Yeah, I’d just been reading those, but they don’t fix the terminology either.
Perhaps you could call them “truth” and “winning” respectively.