I do think altruism is superior: I’m not sure exactly how to unpack ethical statements, but I definitely believe altruism is better than egoism. I also think ‘selfishness’ has a well-understood meaning, roughly maximising your own happiness/power/whatever, and that redefining it so that it’s selfish to do whatever you think is right is fairly pointless. ‘Preferences’ is a ridiculously broad term, and you seem to be treating ‘people follow their preferences’ as true by definition, which means ‘people are selfish’ doesn’t have much content as a claim.
In practice, people aren’t perfect altruists. But defining however you act as maximising your utility function, and therefore as just as good as anything else, is simply a refusal to engage with ethics: you end up reverting to brute force (‘I cannot object ethically to the fact that your utility function involves rape and murder, but I can oppose you based on my utility function’). I’m not sure what good moving all ethical debate to this level achieves.
Oh, and on the altruistically-having-sex approach: again, we live in a society where we reasonably expect non-interference and non-deception, but we don’t usually expect people to actively do what they don’t want to do. A theoretical utility-maximiser might have sex with people they’re not attracted to, sure.
On valuing people: I would understand valuing someone to go beyond the level of ‘I won’t actively harm and abuse you on a whim’. Although even in the minimal sense of valuing (does he care about her at all?), the statement that kicked this off doesn’t demonstrate any consideration for her experience. As you note, raping/drugging etc. have bad consequences for him, and as for getting her to drop out, I imagine it would be far more effort, have far more unpredictable results (she or her friends might end up taking revenge on him for screwing up her life), and not be worth it if he just wants sex.
a theoretical utility-maximiser might have sex with people they’re not attracted to, sure
It depends on what their utility function is—assuming the orthogonality thesis, for any X whatsoever there’s a theoretical utility maximiser who might do X, so that’s not terribly informative about X.