It’s not a utility monster scenario. The king doesn’t receive more happiness than other beings per unit of resources; he’s a normal human being, just like all the others. While summing utility allows utility monsters, which seems bad, your method of “if some of the people are happy, then it’s just subjective” allows a reverse Omelas, which seems worse. It reminds me a bit of deontologists who criticize utilitarianism while allowing much worse things if applied consistently. Regarding the second part, I’m not against rules or limits or even against suffering. I just think that a much better game is possible, one that respects conscious beings more. No more bullshit like kids who are born with cancer and spend their whole life dying in misery, or sea turtles that come into existence only to be eaten by predators, and so on and so forth. Video games are a good example; they have rules and limitations and loss conditions, but they are engineered with the player in mind and for the player’s benefit, while in life, conscious beings are not promised interesting or fair experiences and might just be randomly tortured.
Ok, sorry, I phrased that wrong. I know the scenario you described isn’t a utility monster one, but it can be turned into one simply by cranking up the knob on how much the king enjoys himself, all while staying just as unfair; so my point is that total utility doesn’t really capture the thing you feel is actually wrong here. I actually did write something more on this (though in a humorous tone) in this post.
I don’t mean that “it’s subjective” fixes everything. I just mean that it’s also the reason why it’s not entirely right, IMO, to write off an entire universe based on total utility. Like, my intuition is that even if we had a universe with net negative total utility, it still wouldn’t be right to just snap our fingers with the Infinity Gauntlet and make it disappear unless every single sentient being in it, individually, was genuinely miserable to the point of wanting to die but being unable to.
Regarding the second part, I’m not against rules or limits or even against suffering.
The reason why I bring up rules and limits is more to stress how much our morality (the same morality by which we judge the wrongness of the universe) is born of that universe’s own internal logic. For example, if we see someone drowning, we think it’s right to help them because we know that drowning is a thing that can happen to you without your consent (and because we estimate that, on expectation, it’s more likely that you wish to live than to die). I don’t mean that we can’t judge the flaws of the universe, but that our moral instincts are probably a better guide to what it takes to improve this universe from the inside (since they were shaped by it) than to what it would take to create a better one from scratch.
Video games are a good example; they have rules and limitations and loss conditions, but they are engineered with the player in mind and for the player’s benefit, while in life, conscious beings are not promised interesting or fair experiences and might just be randomly tortured.
True, but also, with the same power a programmer has over a game, you could just as well engineer a game to be positively torturous to its players, purposefully frustrating and unfair. As things are, I think our universe is just not intelligently designed, neither to make us happy nor to make us miserable. Absent intent, this is what indifference looks like; and I think the asymmetry (where even indifference seems to result in more suffering than pleasure) is just a natural statistical outcome of the fact that enjoyable states are rarer and more specific than neutral or painful ones.