1.
People are more often rational than otherwise when the rational answer happens to say good things about them.
“Good things” in this context is synonymous with good signals. It’s easier to be rational when being rational, or the conclusion that you reach, implies good things about you to other people.
2.
I hope people here agree that learning to be more rational will necessarily change your beliefs in at least some areas (it is unlikely that any one person is right about everything).
People who learn rationality will likely change some of the beliefs that did not have a rational foundation before. And they are likely to (eventually) question all of them. Pre-rational beliefs are more likely than not things that are either neutral or send good signals about you to your social group.
Putting 1. and 2. together makes it seem that people who are rational might get into trouble either by opening debates about a subject (even if they eventually reach the “approved” conclusion) or by reaching a conclusion that would make them look bad in front of others.
Point 3. leads me to believe that the scenario in the previous paragraph would occur more often than I feel it does if I were more rational than I currently am, since I am to some extent selective in employing rationality, using it more often when it gives a convenient result.
If one is very rational, one should also be better at finding ways to avoid sending bad signals, but point 4. seems very strong in people and may overwhelm this; hence point 5. seems true.
A clear way around this at first glance seems to be the Dark Arts, but I don’t want to use them because some seem unethical, while others may cause bad habits that reduce my own rationality. Thus I decided to ask the community the bolded question.
What signals are you talking about?
Could you please rephrase this question? I’m afraid I’m not sure what exactly you’re asking about. I thought the link to signaling would clear up my usage of the words signaling and signal, but I assumed you were familiar with that use.
Here is a summary of what I think you are saying:
“Acquiring the skills of rationality changes you. You will acquire new ways of assessing beliefs, and will forsake some old beliefs for new ones. This change may result in your fitting less well into the social niche that you occupied. This may be a disincentive to making such a change.”
Yes, this is a standard observation in all fields of personal development. The greatest resistance to change comes first from the person making that change, then from those around them, in order from the closest outwards. The only question to ask is, is it worth it? In the case of rationality, I think there is a very clear and simple answer: Yes.
I am minded to suggest some advice for rationality akin to Michael Pollan’s advice for diet. (“Eat food. Not too much. Mostly plants.”)
“Be rational. All the time. About everything.”
The Sequences are mostly about how to be rational, but the basic concept here is ultra-simple. Anything more is over-thinking it.
Be rational about everything, including optimal allocation of cognitive resources.
That’s just a minor detail of the how-to.
Except inasmuch as it amounts to discarding both “all the time” and “about everything” in all but the most esoteric technical sense. Being rational all the time about everything is a terrible idea when running on human hardware.
I still see this as nothing but a trite nitpick. What examples would you give where it is irrational to be rational? Where it’s smart to take stupid pills?
Sometimes thinking about a problem in all its depth costs you more than you would lose by forgoing optimization.
Then the smart thing to do is to not sweat over it.
Speaking of which, this conversation has become a case in point.
Seems to me that Richard is roughly talking about instrumental rationality, while Konkvistador is roughly talking about epistemic rationality. Let’s not quibble over the word rationality.
Voted up for the (in retrospect) good question.