E.g. you could satisfy both values by helping build a (non-sentient) simulation through which they can satisfy their desire to kill you without actually killing you.
But really I think the problem is that when we treat individual actions as if they were terminal values, compromise becomes difficult; true terminal values tend to be more personal than that.