Request for terms clarification:
At first, I thought this was going to be something Freudian, given that you used the word “emotional”.
The way you’re using it later on, though, suggests it means how you’re thinking about yourself in relation to others? But there’s also something in the later examples you give that seems to be about being arrogant / assuming you actually are smarter than most people?
I’ll run with the “thinking you actually are smarter than others” thing for now. In response to that line of inquiry:
I think it’s probably true that most people here on LW often are, in fact, the smartest person (along certain axes) in the room. That said, there’s a good exercise here (to run with your example of people giving arguments for X that you find unconvincing): genuinely ask yourself, “What’s the reason / mechanism that’s causing them to hold this belief so strongly?” That sort of thing is good for improving your models of people, and you’ll probably learn stuff along the way.
I also think that biasing towards thinking you’re wrong could be a healthy overcorrection for the standard LW person. There’s a pretty clear failure mode here: we’re actually wrong about X, but the people we meet aren’t giving us the best arguments in favor of not-X. So sometimes you need to be able to fill in that gap for yourself, or notice that “lack of good, available arguments for Y” doesn’t necessarily imply “Y isn’t a tenable position.”
Lastly, there seems to be something good, in the vein of point 1 and bridging inferential distances, about “translating” between ontologies. Clearly you have a worldview in which they’re incorrect, and they have a worldview in which they aren’t. Peeking into what the differences are could be interesting, as could figuring out what the strongest form of what they’re saying would look like within your worldview.