LW was the first place I’ve been where women caring about their own interests is viewed as a weird inimical trait which it’s only reasonable to subvert, and I’m talking about PUA.
Could you give some examples? I’m having trouble thinking of any.
The general idea that women not being attracted to men who are attracted to them is just some arbitrary wrongness in the universe that any sensible man should try to get the women to ignore.
Fixing the man (as opposed to confusing the woman) seems like a good intervention, if it’s possible to a sufficient extent. The difficulty is that behavior and appearance are important aspects of a person, so fixing someone may mean changing their behavior and appearance, which looks superficially similar to changing those same things with the goal of confusion or deception. That apparently inescapable surface similarity opens benevolent self-improvement in this area to the charge of deception, and it looks like it’s often hard for both sides to keep the two categories apart.
The general idea that women not being attracted to men who are attracted to them is just some arbitrary wrongness in the universe
Well, if they were attracted to the men attracted to them, this would increase total utility. One of the less pleasant implications of utilitarianism.
On the other hand, it’s interesting that people are willing to swallow pushing people in front of trolleys, but not to swallow the above. Probably related to this.
The general idea that women not being attracted to men who are attracted to them is just some arbitrary wrongness in the universe
Well, if they were attracted to the men attracted to them, this would increase total utility. One of the less pleasant implications of utilitarianism.
This is only an implication of utilitarianism to the extent that forcibly wireheading everyone is an implication of utilitarianism. However, given some of your other remarks about unpleasant truths conflicting with social conformity, I suspect you intended your comment not as an argument against utilitarianism, but rather as an argument for PUA. Am I reading the tea-leaves correctly here?
This is only an implication of utilitarianism to the extent that forcibly wireheading everyone is an implication of utilitarianism.
Well, one can deal with wireheading by declaring that wireheads don’t count towards utility and/or have negative utility. That approach doesn’t work in this case since we don’t want to assign negative utility to the state of two people being attracted to each other.
I suspect you intended your comment not as an argument against utilitarianism, but rather as an argument for PUA. Am I reading the tea-leaves correctly here?
Why can’t I do both? After all, the correct Bayesian response to discovering that two ideas seem to contradict is to decrease one’s confidence in both.
Well, one can deal with wireheading by declaring that wireheads don’t count towards utility and/or have negative utility.
One can deal with any counterexample by declaring that it “doesn’t count”. That does not make it not count. Wireheads, by definition, experience huge utility. That is what the word means, in discussions of utilitarianism.
That approach doesn’t work in this case since we don’t want to assign negative utility to the state of two people being attracted to each other.
We might very well want to assign negative utility to the process whereby that happened, for the same reasons as for forcible wireheading.
I suspect you intended your comment not as an argument against utilitarianism, but rather as an argument for PUA. Am I reading the tea-leaves correctly here?
Why can’t I do both?
That is just a way of not saying what you do. Do you, in fact, do both, and how much of each?
After all, the correct Bayesian response to discovering that two ideas seem to contradict is to decrease one’s confidence in both.
The correct rational response is to resolve the contradiction, not to ignore it and utter platitudes about the truth lying between extremes. Dressing the latter up in rationalist jargon does not change that.
We might very well want to assign negative utility to the process whereby that happened, for the same reasons as for forcible wireheading.
That’s my point: you need to assign utility to processes rather than just outcomes.
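As a minimal sketch of what assigning utility to processes rather than just outcomes could look like, the snippet below scores the same end state differently depending on how it was reached. The category names and the numbers are invented for illustration, not taken from anything in this thread.

```python
# Minimal sketch: utility over (outcome, process) pairs rather than outcomes alone.
# All labels and numbers are illustrative assumptions.

OUTCOME_VALUE = {
    "mutual_attraction": 10,  # the end state, valued in isolation
    "no_attraction": 0,
}

PROCESS_PENALTY = {
    "self_improvement": 0,       # "fixing the man": no penalty
    "deception": -15,            # "confusing the woman": outweighs the outcome
    "forcible_wireheading": -100,
}

def utility(outcome: str, process: str) -> int:
    """Utility of an outcome, given the process that produced it."""
    return OUTCOME_VALUE[outcome] + PROCESS_PENALTY[process]

# The same outcome gets different verdicts once the process counts:
print(utility("mutual_attraction", "self_improvement"))      # 10
print(utility("mutual_attraction", "deception"))              # -5
print(utility("mutual_attraction", "forcible_wireheading"))   # -90
```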
That is just a way of not saying what you do. Do you, in fact, do both, and how much of each?
I am in fact doing both, in this case mostly against utilitarianism.
The correct rational response is to resolve the contradiction, not to ignore it and utter platitudes about the truth lying between extremes.
There is a difference between assuming the truth lies between two extremes, and assigning significant probability (say ~50%) to each of the two extremes. I’m trying to do the latter.
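To make the Bayesian point above concrete (that discovering a contradiction between two ideas should lower one’s confidence in both), here is a toy calculation. The priors of 0.8 and 0.7 and the independence assumption are arbitrary choices for illustration.

```python
# Toy illustration: learning that two ideas contradict (they cannot both be
# true) lowers the probability of each, starting from independent priors.

p_a = 0.8  # prior confidence in idea A (arbitrary)
p_b = 0.7  # prior confidence in idea B (arbitrary)

# Joint probabilities under independence.
only_a  = p_a * (1 - p_b)        # A true, B false
only_b  = (1 - p_a) * p_b        # A false, B true
neither = (1 - p_a) * (1 - p_b)  # both false

# Condition on "not both true": drop the A-and-B case and renormalise.
evidence = only_a + only_b + neither
post_a = only_a / evidence
post_b = only_b / evidence

print(f"P(A): {p_a} -> {post_a:.3f}")  # 0.8 -> 0.545
print(f"P(B): {p_b} -> {post_b:.3f}")  # 0.7 -> 0.318
```

In this toy example both probabilities fall, though by different amounts; the exact split depends on the priors.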
The general idea that women not being attracted to men who are attracted to them is just some arbitrary wrongness in the universe that any sensible man should try to get the women to ignore.
o.O
Seriously? I mean, everyone wants to be more attractive, but … that’s a very, well, psychopath-y way of looking at it.
I think I’ve somehow managed not to run into this, do you have any links?