For conversations where you think you just know better, I recommend active listening. No, I don’t mean the shallow thing of becoming a conversational parrot. You try to put what they said into your own words, to check whether you understood their claim. This does several good things for the conversation. One, it often reveals mistakes in your interpretation of them, which helps avoid pointless rabbit-holes in the discussion based on misunderstandings. Two, it lets them know you are trying to understand, so that when you raise your objections they are likely to respond better. Three, taking a conversational tone that is open to their view helps you to actually be open to their view.
I also recommend taking an attitude of playing around with ideas. Keep an eye on what you believe, but take an interest in the alternatives, even the ones which sound crazy. That way, you can notice when the “crazy” ones seem to line up better with reality.
I’d second the idea of active listening and checking for clarification. If I’m at the point where I’m fairly certain that this person doesn’t know what they are talking about, I stop putting effort into arguing and just see if I can learn anything about how they came to this point of view.
Request for terms clarification:
At first, I thought this was going to be something Freudian, given that you used the word “emotional”.
The way you’re using it later on, though, I think it means how you’re thinking about yourself in relation to others? But there’s also something in the later examples you give that seems to be about being arrogant / assuming you actually are smarter than most people?
I’ll run with the “thinking you actually are smarter than others” thing for now. In response to that line of inquiry:
I think it’s probably true that most people here on LW actually are often the smartest person (along certain axes) in the room. That being said, there’s a good exercise here (to run with your example of people giving arguments for X that you find unconvincing) where you genuinely try to ask yourself, “What’s the reason or mechanism that’s causing them to hold this belief so strongly?” I think that sort of exercise is good for improving your models of people, and you’ll probably learn stuff.
I also think that deliberately biasing towards thinking you’re wrong could be a useful overcorrection for the standard LW person. There’s a pretty clear failure mode here where we’re actually wrong about X, but the people we meet also aren’t giving us the best arguments in favor of not-X. So sometimes you need to be able to fill in that gap for yourself, or notice that “a lack of good, available arguments for Y” doesn’t necessarily imply “Y isn’t a tenable position.”
Lastly, there seems to be something good, in the vein of point 1 and bridging inferential distances, about “translating” between ontologies. Clearly you have a worldview where they’re incorrect, and they have a worldview where they aren’t. Peeking into what the differences are could be interesting, as could working out what the strongest form of what they’re saying would look like within your worldview.
I have a way of talking with people where, when someone says something I disagree with and I sense they have an ego, I don’t outright disagree with them. Instead I start asking them a lot of questions about their stance, getting them to flesh out what it is that they actually believe. This is much easier to do when you are still “on the same side”, as you haven’t clashed with their ego by telling them they’re wrong. In my experience, fewer biases are triggered when someone is explaining something to a person they think is curious than to a person they think is “out to get them”.
“but often I’m in the position where I’m like”
I think there’s something egotistical about thinking you’re operating at a different intellectual level. On the other hand, that doesn’t make it false.
But I think you’re talking about rhetoric. Deep down, I think we know we’re different intellectually (and that doesn’t need to mean smarter; it could just be insight via anti-socialness). But arguing that way isn’t very productive. One thing I’ve learned about selling is that you have to get people to like you before they’ll buy from you. They don’t like you if you make them feel dumb, and in an insecure world it isn’t hard to make people feel dumb without trying to.