Based on what evidence?

It’s a guess, but I think it’s a fairly logical one. Think about all the stories of rationalists who’ve overcome a belief in God, or ESP, or whatever. Seems to me that demonstrates an ability to suppress emotion and follow logic that should carry over into other areas.
As I mentioned in another comment, you can just read LW threads on contentious topics to observe, as a matter of practice, that LW rationalists at least are no different from other people in this respect: open only to what they’re not already opposed to.
This is relevant evidence: evidence directly connected to the topic (openness to unsolicited advice). Your evidence is not, because it describes situations where rationalists changed their minds on their own. This is really different—changing your own mind is in no way similar to being open to someone else changing your mind, since somebody else trying to change your mind creates internal resistance in a way that changing your own mind does not.
It’s like using people’s ability to walk on dry land as evidence of their ability to swim underwater, when an actual swimming test shows the people all drowning. ;-)
Since I remember your username being associated with various PUA discussions, I assume you at least partly have those in mind. I can’t say much about those, not having ever really been part of the discussion, but I’ll note that it’s a particularly contentious issue. (My position has changed somewhat, but given that I only had a vague awareness of the PUA community before, and not through anyone who participated in or approved of them, I don’t consider that especially remarkable.) And Less Wrongers seem to be more pliable than the norm on less contentious matters which still provoke significant resistance in much of the population, such as the safety of fireplaces.
Since I remember your username being associated with various PUA discussions, I assume you at least partly have those in mind. I can’t say much about those, not having ever really been part of the discussion, but I’ll note that it’s a particularly contentious issue
It’s not the only one. See any thread on cryonics, on how well SIAI is doing on various dimensions, or on nutrition, exercise, and nootropics… it’s not hard to run across similar examples of closed-mindedness on BOTH sides of a discussion.
Less Wrongers seem to be more pliable than the norm
My point is: not nearly enough.
on less contentious matters which still provoke significant resistance in much of the population, such as the safety of fireplaces.
As I mentioned in that thread, LWers skew young and toward not already having fireplaces; that they’d be less attached to them is kind of a given.
This is really different—changing your own mind is in no way similar to being open to someone else changing your mind, since somebody else trying to change your mind creates internal resistance in a way that changing your own mind does not.
Not only is it similar, the abilities in those areas are significantly correlated.
Agreed. Wanting to be “the kind of person who changes their mind” means that when someone else is trying to change your mind, and you notice that you’re getting defensive and making excuses not to change it, the cognitive dissonance of not being the kind of person you want to be makes it more likely, at least some of the time, that you’ll make yourself be open to changing your mind.
This is a nice idea, but it doesn’t hold up that well under mindkilling conditions: i.e. any condition where you have a stronger, more concrete loyalty to some other chunk of your identity than being the kind of person who changes your mind, and you perceive that other identity to be threatened.
It also doesn’t apply when you’re blocked from even perceiving someone’s arguments, because your brain has already cached a conclusion as being so obvious that only an evil or lunatic person could think something so stupid. Under such a condition, the idea that there is even something to change your mind about will not occur to you: the other person will just seem to be irredeemably wrong, and instead of feeling cognitive dissonance at trying to rationalize, you will feel like you are just patiently trying to explain common sense to a lunatic or a troll.
IOW, everyone in this thread who’s using their own experience (inside view) as a guide to how rational rationalists are is erring by not using the available outside-view evidence of how rational rationalists aren’t: your own experience doesn’t include the times when you didn’t notice you were being closed-minded, so your estimates will be way off.
Not only is it similar, the abilities in those areas are significantly correlated.
In order to use that ability, you have to realize it needs to be used. If someone is setting out to change their own mind, then they have already realized the need. If someone is being offered advice by others, they may or may not realize there is anything to change their mind about. It is this latter skill (noticing that there’s something to change your mind about) that I’m distinguishing from the skill of changing your mind. They are not at all similar, nor is there any particular reason for them to be correlated.
Really? You don’t think the sort of person who tries harder than average to actually change their mind more often will also try harder than average to examine various issues that they should change their mind about?
Really? You don’t think the sort of person who tries harder than average to actually change their mind more often will also try harder than average to examine various issues that they should change their mind about?
But that isn’t the issue: it’s noticing that there is something you need to examine in the first place, vs. just “knowing” that the other person is wrong.
Honestly, I don’t think that the skill of being able to change your mind is all that difficult. The real test of skill is noticing that there’s something to even consider changing your mind about in the first place. It’s much easier to notice when other people need to do it. ;-)
Inasmuch as internal reflective coherence and a desire to self-modify (towards any goal), or even just the urge to signal that desire, are not the same thing... yeah, it doesn’t seem to follow that these traits would necessarily correlate.
I hadn’t considered that. Ego does get in the way more when other people are involved.