Oh, I’ve thought of another example:
Less Wrongers and other rationalists frequently get told that “rationality is nice but emotion is important too”. Less Wrongers typically react to this by:
1) Mocking it as a fallacy, because “rationality is defined as winning, so it is not opposed to emotion”, then eagerly adopting the strawman and posting the erroneous argument all over the place to show everyone how poor the enemies of reason are at reasoning.
Instead of:
2) Actually considering for five minutes whether there might be a correlation, or even an inverse causal relationship, between rationality and emotional control (or the ability to read emotions) that produces this observation in the first place.
Needless to say, I blame Yudkowsky.
“Observations” are not always caused by people observing things.
The best-known example of rationality associated with emotional control is Spock from Star Trek. And Spock is fictional. And fiction affects how people think about reality.
The point is that you don’t ignore countless people saying the same thing just because you can think of a reason to dismiss them. Even if you are right and that’s all it is, you’ll still have sinned for not considering it.
Otherwise clever people would always find excuses to justify their existing beliefs, and then where would we be?
Doesn’t the very fact that I have a reason imply that I must have considered it?
And at any rate, how is “they got their ideas about rationality from popular fiction” a failure to consider? Countless people saying the same thing does not mean it has merit. And in this case, there’s a very well-known, fairly obvious reason why countless people would say such a thing. You may as well ask why countless people think that crashed cars explode.
My point was that you’re not supposed to stop thinking after finding a plausible explanation, and most certainly not after finding the single most convenient possible explanation. “Worst of all possible worlds” and all that.
If you feel this doesn’t apply to you, then please do not feel as though I’m addressing you specifically. It’s supposed to be advice for Less Wrong as a whole.