I was referring though to the case of your friend using reinforcement to alter your behavior in a way that would benefit you. I just have a hard time seeing someone trying to help you as an unethical behavior.
It does depend on whose definition of ‘help’ they’re using.
Good point. Do you think it would be ethical if they were helping to fulfill your preferences?
Usually, yes, though there are several qualifications and corner cases.
Agreed, there probably are.
That’s fair. I should tone down my point and say that doing this sort of thing is disrespectful, not evil or anything. It’s the sort of thing parents and teachers do with kids. With your peers, unsolicited reinforcement training is seen as disrespectful because it stands in lieu of just explaining to the person what you think they should be doing.
In my experience, telling other people how I think they should behave is also often seen as disrespectful.
Often it is, we agree. But it’s the ‘telling’ there that’s the problem. A respectful way to modify someone’s behavior is to convince them to do something different (which may mean convincing them to subject themselves to positive reinforcement training). The difference is often whether we appeal to someone’s rationality, or take a run at their emotions.
I agree that there are respectful ways to convince me to do something different, thereby respectfully modifying my behavior.
Many of those ways involve appealing to my rationality.
Many of those ways involve appealing to my emotions.
There are also disrespectful ways to convince me to do something different.
Many of those ways involve appealing to my rationality.
Many of those ways involve appealing to my emotions.
So, by ‘appealing to someone’s rationality’ I mean, at least, arguing honestly. Perhaps I should have specified that. Do you still think there are such examples?
Do I think there are disrespectful ways to convince me to do something different that involve arguing honestly? Sure. Do you not?
Not that I can think of, no. Can you think of an example?
Sure. Suppose I believe my husband is a foolish, clumsy, unattractive oaf, and I want him to take dance lessons. Suppose I say to him, “Hey, husband! You are a foolish, clumsy, unattractive oaf. If you take dance lessons, you will be less clumsy. That’s a good thing. Go take dance lessons!” I would say, in that situation, I have presented an honest, disrespectful argument to my husband with the intention of convincing him to do something different.
That’s not really a very good example. That in virtue of which it’s disrespectful is unconnected to that in virtue of which it appeals to reason.
I agree completely that my example is disrespectful in virtue of (in vice of?) something other than its appeal to reason.
If that makes it a poor example of what you’re asking for, I misunderstood what you were asking for. Which, given that you’re repeatedly asking me for “an example” without actually saying precisely what you want an example of, is not too surprising.
So, perhaps it’s best to back all the way out. If there’s something specific you’d like me to provide an example of, and you can tell me what it is, I’ll try to provide an example of it if I can. If there isn’t, or you can’t, that’s OK too and we can drop this here.
Well, this runs into the problem of giving unsolicited advice. Most people don’t respond well to it. I think it’s probably difficult for most rationalists to remember this, since we are probably more open to such advice.
Not really. Rationalists are just open to different advice. There’s lots of advice rationalists will reject out of hand. (Some of which is actually bad advice, and some of which is not.)
Everyone believes themselves to be open-minded; the catch is that we’re all open to what we’re open to, and not open to what we’re not.
Well I agree that none of us is completely rational when it comes to accepting advice. But don’t you think rationalists are at least better at that than most people?
Based on what evidence?
It’s a guess but I think it’s a fairly logical one. Think about all the stories of rationalists who’ve overcome a belief in God, or ESP or whatever. Seems to me that demonstrates an ability to suppress emotion and follow logic that should carry over into other areas.
As I mentioned in another comment, you can just read LW threads on contentious topics to observe as a matter of practice that LW rationalists at least are no different than other people in this respect: open only to what they’re not already opposed to.
This is relevant evidence: evidence directly connected to the topic (openness to unsolicited advice). Your evidence is not, because it describes situations where rationalists changed their minds on their own. This is really different—changing your own mind is in no way similar to being open to someone else changing your mind, since somebody else trying to change your mind creates internal resistance in a way that changing your own mind does not.
It’s like using people’s ability to walk on dry land as evidence of their ability to swim underwater, when an actual swimming test shows the people all drowning. ;-)
Since I remember your username being associated with various PUA discussions, I assume you at least partly have those in mind. I can’t say much about those, never having really been part of the discussion, but I’ll note that it’s a particularly contentious issue. (My position has changed somewhat, but given that I previously had only a vague awareness of the PUA community, and not through anyone who participated in or approved of it, I don’t consider that especially remarkable.) And Less Wrongers seem to be more pliable than the norm on less contentious matters which still provoke significant resistance in much of the population, such as the safety of fireplaces.
I’ll note that it’s a particularly contentious issue
It’s not the only one. See any thread on cryonics, how well SIAI is doing on various dimensions, discussions of nutrition, exercise, and nootropics… it’s not hard to run across examples of similar instances of closed-mindedness on BOTH sides of a discussion.
Less Wrongers seem to be more pliable than the norm
My point is: not nearly enough.
on less contentious matters which still provoke significant resistance in much of the population, such as the safety of fireplaces.
As I mentioned in that thread, LWers skew young and toward not already having fireplaces: that they’d be less attached to them is kind of a given.
changing your own mind is in no way similar to being open to someone else changing your mind
Not only is it similar, the abilities in those areas are significantly correlated.
Agreed. Wanting to be “the kind of person who changes their mind” means that when you get into a situation of someone else trying to change your mind, and you notice that you’re getting defensive and making excuses not to change your mind, the cognitive dissonance of not being the kind of person you want to be makes it more likely, at least some of the time, that you’ll make yourself be open to changing your mind.
This is a nice idea, but it doesn’t hold up that well under mindkilling conditions: i.e. any condition where you have a stronger, more concrete loyalty to some other chunk of your identity than being the kind of person who changes your mind, and you perceive that other identity to be threatened.
It also doesn’t apply when you’re blocked from even perceiving someone’s arguments, because your brain has already cached a conclusion as being so obvious that only an evil or lunatic person could think something so stupid. Under such a condition, the idea that there is even something to change your mind about will not occur to you: the other person will just seem to be irredeemably wrong, and instead of feeling cognitive dissonance at trying to rationalize, you will feel like you are just patiently trying to explain common sense to a lunatic or a troll.
IOW, everyone in this thread who’s using their own experience (inside view) as a guide to how rational rationalists are is erring in not using the available outside-view evidence of how rational rationalists aren’t: your own experience doesn’t include the times where you didn’t notice you were being closed-minded, and thus your estimates will be way off.
Not only is it similar, the abilities in those areas are significantly correlated.
In order to use that ability, you have to realize it needs to be used. If someone is setting out to change their own mind, then they have already realized the need. If someone is being offered advice by others, they may or may not realize there is anything to change their mind about. It is this latter skill (noticing that there’s something to change your mind about) that I’m distinguishing from the skill of changing your mind. They are not at all similar, nor is there any particular reason for them to be correlated.
Really? You don’t think the sort of person who tries harder than average to actually change their mind more often will also try harder than average to examine various issues that they should change their mind about?
But that isn’t the issue: it’s noticing that there is something you need to examine in the first place, vs. just “knowing” that the other person is wrong.
Honestly, I don’t think that the skill of being able to change your mind is all that difficult. The real test of skill is noticing that there’s something to even consider changing your mind about in the first place. It’s much easier to notice when other people need to do it. ;-)
Inasmuch as internal reflective coherence and a desire to self-modify (toward any goal), or even just the urge to signal that desire, are not the same thing… yeah, it doesn’t seem to follow that these two traits would necessarily correlate.
I hadn’t considered that. Ego does get in the way more when other people are involved.
This feels like an equivocating-shades-of-grey argument, of the form ‘nobody is perfectly receptive to good arguments, and perfectly unswayed by bad ones, therefore, everyone is equally bad at it.’ Which is, of course, unjustified. In truth, if rationalists are not at least somewhat more swayed by good arguments than bad ones (as compared to the general population), we’re doing something wrong.
Not really, we’re just equally susceptible to irrational biases.
Trivial proof for LW rationalists: read any LW thread regarding a controversial self-improvement topic, including nutrition, exercise, dating advice, etc., where people are diametrically opposed in their positions, using every iota of their argumentative reasoning power in order not to open themselves to even understanding their opponents’ position, let alone reasoning about it. It is extremely improbable that all divisive advice (including diametrically-opposed divisive advice) is incorrect, and that therefore the bulk of LW rationalists are correctly rejecting it.
(Side note: I didn’t say anything about receptiveness to good arguments, I said receptiveness to unsolicited advice, as did the comment I was replying to. I actually assumed that we were talking about bad arguments, since most arguments, on average, are bad. My point was more that there are many topics which rationalists will reject out of hand without even bothering to listen to the arguments, good or bad, and that in this, they are just like any other human being. The point isn’t to invoke a fallacy of the grey, the point is for rationalists not to pat ourselves on the back in thinking we’re demonstrably better at this than other human beings: demonstrably, we’re not.)
It amuses me how readily my brain offered “I am not neither open-minded!” as a response to that.