This premise sounds interesting, but I feel like concrete examples would really help me be sure I understand.
Hm, okay, let me try to make it more concrete.
My main example is one where people told me, more than once, that “I might have my own truth, but other people have their truth as well”. This was incredibly easy to dismiss as people being unable to tell map from territory, but after the third time I started to wonder why people kept telling me this. So I asked them what made them bring it up in the first place, and they replied that they felt uncomfortable when I stated facts with the confidence those facts warranted. I was reminded of something Richard Dawkins said: “clarity is often seen as offensive.”

I asked some other people whether they felt the same way, and a helpful people-person told me that the people in question (they worked in HR) felt threatened by my intelligence, and that my stating things with confidence reminded them of it. So I got the advice to phrase my statements of belief in a friendlier way. I hated this because it felt dishonest, having to use weasel words to hide the fact that I felt confident, but I could no longer deny that my current method wasn’t working.
The meta-level lesson I learned was the one presented in the OP: when people give you advice or objections, they almost never say what they mean or what the actual problem is. They substitute something that sounds nice and inoffensive, which makes their advice easy to dismiss as nonsense. So what you are supposed to do is find out what they originally meant and draw a lesson from that instead.
Another example: My father often tells me not to be cynical, but this doesn’t make much sense to me because he is very cynical himself. It turns out that what he actually means is that I should be more upbeat, or as Scott Adams would put it: “Be a huge phony.” The reason my father does not state this outright is that he is following his own rule even while giving the advice: he rephrases “be a huge phony” as “don’t be cynical”, because “be a huge phony” sounds cynical.
I translate “I might have my own truth, but other people have their truth as well” as “You might have your perspective, but other people have their own perspectives. No one has the complete truth (territory), so don’t state your mere perspective as if it’s the complete truth.”
Another translation: “You may be certain you’re right, but the people you’re arguing with are just as certain that they are right.”
That is a perfectly valid interpretation, but it doesn’t explain why several people independently felt the need to explain this to me specifically, especially since the advice was worded in general terms and at the time I was simply stating facts. This implied that there was something about me specifically that was bothering them.
Hence the lesson: Translate by finding out what made them give that advice in the first place, and only then rephrase it as good advice.
LOL :-)
Oh, I’ve thought of another example:
Less Wrongers and other rationalists frequently get told that “rationality is nice but emotion is important too”. Less Wrongers typically react to this by:
1) Mocking it as a fallacy because “rationality is defined as winning so it is not opposed to emotion”, before eagerly taking it up as a strawman and posting the erroneous argument all over the place to show everyone how poor the enemies of reason are at reasoning.
Instead of:
2) Actually considering for five minutes whether there might be a correlation, or even an inverse causal relationship, between rationality and emotional control or the ability to read emotions, which would cause this observation in the first place.
Needless to say, I blame Yudkowsky.
“Observations” are not always caused by people observing things.
The best-known example of rationality associated with emotional control is Spock from Star Trek. And Spock is fictional. And fiction affects how people think about reality.
The point is that you don’t ignore countless people saying the same thing just because you can think of a reason to dismiss them. Even if you are right and that’s all it is, you’ll still have sinned for not considering it.
Otherwise clever people would always find excuses to justify their existing beliefs, and then where would we be?
Doesn’t the very fact that I have a reason imply that I must have considered it?
And at any rate, how is “They got their ideas about rationality from popular fiction” a failure to consider? Things are not always said by countless people because they have merit. And in this case, there’s a very well-known, fairly obvious reason why countless people would say such a thing. You may as well ask why countless people think that crashed cars explode.
My point was that you’re not supposed to stop thinking after finding a plausible explanation, and most certainly not after finding the single most convenient explanation available. “Worst of all possible worlds” and all that.
If you feel this doesn’t apply to you, then please do not feel as though I’m addressing you specifically. It’s supposed to be advice for Less Wrong as a whole.