But that presupposes that I value cooperation with you. I don’t think it’s possible to get moral weights from an outside source even in principle; you have to decide that the outside source in question is worth it, which implies you are weighing it against your actual, internal values.
It’s like how selfless action is impossible; if I want to save someone’s life, it’s because I value that person’s life in my own utility function. Even if I sacrifice my own life to save someone, I’m still doing it for some internal reason; I’m satisfying my own, personal values, and they happen to say that the other person’s life is worth more.
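The point can be made concrete with a toy sketch (all names and weights here are made-up assumptions for illustration, not anything from this thread): a "selfless" sacrifice is still the act that the agent's own utility function ranks highest, because the weights doing the ranking are internal.

```python
# Toy model: even self-sacrifice maximizes the agent's OWN utility,
# because the weights are the agent's own. Weights are illustrative.

weights = {"own_life": 1.0, "stranger_life": 1.5}  # assumed values

def utility(outcome):
    """Sum the agent's internal weights over whatever the outcome preserves."""
    return sum(weights[v] for v in outcome)

options = {
    "sacrifice": ("stranger_life",),  # agent dies, stranger lives
    "walk_away": ("own_life",),       # agent lives, stranger dies
}

best = max(options, key=lambda name: utility(options[name]))
print(best)  # -> sacrifice: the internal weights decide, not an outside source
```

Nothing in the model consults an external source of weights; changing the conclusion requires changing the agent's own numbers.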
> But that presupposes that I value cooperation with you. I don’t think it’s possible to get moral weights from an outside source even in principle; you have to decide that the outside source in question is worth it, which implies you are weighing it against your actual, internal values.
I think you’re mixing up levels, here. You have your internal values, by which you decide that you like being alive and doing your thing, and I have my internal values, by which I decide that I like being alive and doing my thing. Then there’s the local king, who decides that if we don’t play by his rules, his servants will imprison or kill us. You and I both look at our values and decide that it’s better to play by the king’s rules than not play by the king’s rules.
If one of those rules is “enforce my rules,” now when the two of us meet we both expect the other to be playing by the king’s rules and willing to punish us for not playing by the king’s rules. This is way better than not having any expectations about the other person.
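The king's "enforce my rules" rule can be sketched as a shift in payoffs (the numbers below are assumed, prisoner's-dilemma-style stakes, not anything stated in the thread): without enforcement, breaking the rules tempts each player; with the threat of punishment, playing by the rules becomes each player's best response, which is exactly what makes the mutual expectation stable.

```python
# Sketch with assumed payoffs: the king's punishment shifts each
# player's best response toward "play by the rules".

PUNISHMENT = 5  # assumed cost the king's servants impose on rule-breakers

def payoff(me, other, enforced):
    # Base payoffs form a prisoner's-dilemma-like interaction.
    base = {("rules", "rules"): 3, ("rules", "break"): 0,
            ("break", "rules"): 4, ("break", "break"): 1}[(me, other)]
    if enforced and me == "break":
        base -= PUNISHMENT
    return base

def best_response(other, enforced):
    return max(["rules", "break"], key=lambda me: payoff(me, other, enforced))

print(best_response("rules", enforced=False))  # break: defection tempts
print(best_response("rules", enforced=True))   # rules: enforcement flips it
```

The design point is that enforcement doesn't change anyone's values; it changes the consequences those values are weighing.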
Moral talk is basically “what are the rules that we are both playing by? What should they be?”. It would be bad if I pulled the lever to save five people, thinking that this would make me a hero, and then got shamed or arrested for causing the death of the one person. The reasons to play by the rules at all are personal: appreciating following the rules in an internal way, appreciating other people’s appreciation of you, and fearing other people’s reprisal if you violate the rules badly enough.
If the king was a dictator and forced everyone to torture innocent people, it would still be against my morals to torture people, regardless of whether I had to do it or not. I can’t decide to adopt the king’s moral weights, no matter how much it may assuage my guilt. This is what I mean when I say it is not possible to get moral weights from an outside source. I may be playing by the king’s rules, but only because I value my life above all else, and it’s drowning out the rest of my utility function.
On a related note, is this an example of an intrapersonal utility monster? All my goals are being thrown under the bus except for one, the one I value most highly.
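The "intrapersonal utility monster" worry can be sketched numerically (the values and weights below are illustrative assumptions, not claims from the discussion): when one value's weight dwarfs the others, it dictates every choice, and the rest of the utility function effectively stops mattering.

```python
# Sketch: one dominant weight ("stay_alive") drowns out all other values,
# so the agent complies with the dictator despite its own morals.
# All weights are assumed for illustration.

values = {"stay_alive": 1000.0, "dont_torture": 10.0, "honesty": 5.0}

def utility(satisfied):
    """Total weight of the values a choice manages to satisfy."""
    return sum(values[v] for v in satisfied)

comply = {"stay_alive", "honesty"}    # torture as ordered, survive
refuse = {"dont_torture", "honesty"}  # keep your morals, die

print(utility(comply) > utility(refuse))  # True: one weight decides alone
```

Note that compliance here never means adopting the king's weights: the agent's own "dont_torture" weight is unchanged, it is simply outvoted.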
Your example of the king who wants you to torture is extreme, and doesn’t generalize … you have set up not torturing as a non-negotiable absolute imperative. A more steelmanned case would be compromising on negotiable principles at the behest of society at large.