Trying to be rational for the wrong reasons
Rationalists are people who have an irrational preference for rationality.
This may sound silly, but when you think about it, it couldn’t be any other way. I am not saying that all reasons in favor of rationality are irrational—in fact, there are many rational reasons to be rational! It’s just that “rational reasons to be rational” is a circular argument that is not going to impress anyone who doesn’t already care about rationality for some other reason.
So when there is a debate like “but wouldn’t the right kind of self-deception be more instrumentally useful than perfectly calibrated rationality? Do you care more about rationality or about winning?”, well… you can make good arguments for both sides…
On the one hand, yes, if your goal is to maximize your utility function U, then “maximizing U by any means necessary” is by definition ≥ “maximizing U using rationality”. On the other hand, if you take a step back, how would you know whether your approach X actually maximizes U once you have given up on rationality? The self-deception that you chose instrumentally as part of strategy X could, as a side effect, bias your estimates of how much U you really get by following X… but there may be ways to deflect this counter-argument.
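(As a minimal formalization of that “by definition” step, with notation that is mine rather than the original argument’s: let S be the set of all strategies, S_rat ⊆ S the strategies a rationalist would permit, and E[U | s] the expected utility of following strategy s. Then

\[
S_{\mathrm{rat}} \subseteq S
\;\Longrightarrow\;
\max_{s \in S} \mathbb{E}[U \mid s] \;\ge\; \max_{s \in S_{\mathrm{rat}}} \mathbb{E}[U \mid s],
\]

so the unconstrained maximum can never be smaller. The catch is that the estimate of E[U | s] is itself produced by the possibly self-deceived agent.)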
1) Doublethink. Keep two models of reality simultaneously, one of them rational, the other optimized by the former to be winning. There are some shaky assumptions here. It may be computationally impossible for a human to keep two separate models of reality, to make sure that it is the former that nudges the latter (rather than the other way round, or both nudging each other), and yet that it is the latter (rather than a mix of both) that influences System 1. But this sounds like a nirvana fallacy: the people who choose rationality over doublethink are not doing rationality perfectly either! So let’s compare the average human doublethink against the average human rationality (instead of a hypothetical perfect rationality). Now it is not so clear that rationality wins.
2) Multiple agents. Imagine a father who wants his son to be winning as much as possible. The father could be a perfect rationalist, while raising his son to believe the optimal mix of rationality and self-serving bullshit. Here the objections against self-deception do not apply; the father is not deceiving himself about anything. (We could make a different objection, that the son will not be able to provide the same kind of service to his own children. But that’s moving the goalposts.)
3) Split time. Become a perfect rationalist first, then design the perfect plan for brainwashing yourself into someone more winning (at the cost of losing some rationality), then brainwash yourself. Most of the objections you could make against this idea can be answered by: yeah, assume that the original perfect rationalist considered this possibility and adjusted their plans accordingly. Yeah, in some Everett branches something completely unexpected might happen, in exactly such a way that the original rationalist could have prevented a disaster but the brainwashed person no longer can. But again, compare the average outcomes. A small probability of disaster might be an acceptable price to pay for a large probability of winning more.
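(To make the “compare the average outcomes” step concrete, with symbols that are illustrative rather than from the original: let p be the probability of the unpreventable disaster, U_win the utility of the brainwashed life when things go well, U_dis its utility in the disaster case, and U_rat the utility of staying an ordinary, imperfect rationalist. Assuming U_win > U_dis, brainwashing wins in expectation exactly when

\[
(1-p)\,U_{\mathrm{win}} + p\,U_{\mathrm{dis}} \;>\; U_{\mathrm{rat}}
\quad\Longleftrightarrow\quad
p \;<\; \frac{U_{\mathrm{win}} - U_{\mathrm{rat}}}{U_{\mathrm{win}} - U_{\mathrm{dis}}},
\]

so a small disaster probability can indeed be an acceptable price whenever the expected gain U_win − U_rat is large relative to how bad the disaster would be.)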
Frankly, “if you are no longer a rationalist, you cannot be sure that you are doing the optimal thing” was never my true rejection. I am quite aware that I am not as rational as I could be, so I am not doing the optimal thing anyway. And I don’t even think that the outcome “you are doing the optimal thing, and you think that you are doing the optimal thing, but because you have some incorrect beliefs, you don’t have a justified true belief about doing the optimal thing” is somehow tragic; that sounds like something too abstract to care about, assuming that the optimal thing actually happens regardless.
My true rejection is more like this: I have an irrational preference for things like truth and reason (probably a side effect of mild autism). You provide an argument that is maybe correct or maybe incorrect; I am not really sure. From my perspective, what takes away the temptation is that your strategy requires me to give up a lot of what I actually care about, now, forever, with certainty… and in return I maybe get some other value (possibly much greater) in some unspecified future, assuming that your reasoning is correct and that I can execute your proposed strategy correctly. This simply does not sound like a good deal.
But the deal might be more balanced for someone who does not care about rationality. Then it’s just two strategies supported by similar-sounding, very abstract arguments. And you are going to make some mistakes no matter which one you choose, and in both cases an unlucky mistake might ruin everything. There is too much noise to make a solid argument for either side.
…which is why I consider “arguing that rationality is better than optimal self-deception” a waste of time, despite the fact that I have made my choice and feel strongly about it. The arguments in favor of rationality are either circular (on the meta level) or irrational.