The utilities are undefined, whether you pay up or not is undefined, the actions chosen are undefined. It is akin to maximizing blerg without any definition of what blerg even is: maximizing “expected utility” without ever having defined it.
Call it undefined if you like, but I’d still prefer 3^^^3 people not suffer. It would be pretty weird to argue that human lives decay in utility based on how many there are. If you found out that the universe was bigger than you thought, that there really were far more humans in the universe somehow, would you just stop caring about human life?
It would also be pretty hard to argue that at least some small amount of money isn’t worth giving in order to save a human life, or that a small amount of money isn’t worth a small probability of saving enough lives to make up for how small that probability is.
It would be pretty weird to argue that human lives decay in utility based on how many there are.
Well, suppose there are mind uploads, and one upload is very worried about himself, so he runs himself redundantly as 5 exact copies. Should this upload be a minor utility monster?
3^^^3 is far more than there are possible people.
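A rough sense of scale (the comparison figure here is a deliberately generous invention, not a real estimate of the number of possible people): in Knuth’s up-arrow notation,

$$3\uparrow\uparrow\uparrow 3 \;=\; 3\uparrow\uparrow(3\uparrow\uparrow 3) \;=\; 3\uparrow\uparrow 7{,}625{,}597{,}484{,}987,$$

a power tower of 3s roughly 7.6 trillion levels tall. Even a wildly generous bound on the number of distinct possible people, say $2^{10^{100}}$, is already exceeded by a tower just five levels tall, $3^{3^{3^{3^{3}}}}$, so almost all of the 3^^^3 people would have to be exact duplicates of one another.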
If you found out that the universe was bigger than you thought, that there really were far more humans in the universe somehow, would you just stop caring about human life?
Bounded doesn’t mean the utility just hits a cap and stays there. Also, if you scale down all the utilities that you can affect, it changes nothing about your actions (another confusion: mapping the utility onto how much one cares).
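A minimal sketch of both points, with invented numbers (a toy illustration, not anyone’s actual utility function): a bounded utility can keep strictly increasing without ever reaching its bound, and rescaling every utility you can affect by a positive constant leaves the chosen action unchanged.

```python
import math

# Toy bounded utility: strictly increasing in lives saved, never reaches 1.
# (Illustrative only; the scale constant is made up.)
def bounded_utility(lives_saved, scale=1e9):
    return 1.0 - math.exp(-lives_saved / scale)

print(bounded_utility(1e9) < bounded_utility(2e9) < 1.0)  # True: still rising, still below the bound

# Rescaling every achievable utility by a positive constant changes nothing
# about which action an expected-utility maximizer picks.
def best_action(actions, scale=1.0):
    # actions: {name: [(probability, utility), ...]}
    def expected_utility(outcomes):
        return sum(p * scale * u for p, u in outcomes)
    return max(actions, key=lambda name: expected_utility(actions[name]))

actions = {
    "pay the mugger":  [(1e-20, 1.0), (1.0 - 1e-20, -1e-9)],  # invented numbers
    "keep your money": [(1.0, 0.0)],
}
assert best_action(actions, scale=1.0) == best_action(actions, scale=1e-6)
```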
And yes, there are definitely cases where money is worth a small probability of saving lives, and everyone agrees on such. For example, if we find out that an asteroid has some chance of hitting Earth, we’d give money to space agencies even when that chance is rather small (we’d not give money to cold fusion crackpots, though). There’s nothing fundamentally wrong with spending a bit to avert a small probability of something terrible happening. The problem arises when the probability is overestimated, when the consequences are poorly evaluated, and so on. It is actively harmful, for example, to encourage boys to cry wolf needlessly. I think people innately feel that if they are giving money away, losing something, then some giant fairness fairy will make the result more likely good than bad for everyone. The world doesn’t work like that: all the naive folks who jump at the opportunity to give money to someone promising to save the world, no matter how ignorant, uneducated, or crackpotty that person is in the fields where correctness can be checked at all, are increasing risk, not decreasing it.
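To make that contrast concrete, here is a toy expected-value comparison with entirely invented numbers: the issue is not that the probability of catastrophe is small, it is whether the money actually buys a credible reduction in that risk.

```python
# Toy expected-value comparison (all figures invented for illustration).
def expected_benefit(p_catastrophe, p_intervention_works, value_at_stake, cost):
    return p_catastrophe * p_intervention_works * value_at_stake - cost

# Credible case: well-characterized asteroid risk, competent space agency.
asteroid = expected_benefit(p_catastrophe=1e-4, p_intervention_works=0.5,
                            value_at_stake=1e12, cost=1e6)

# Crackpot case: same pitch and the same small catastrophe probability,
# but the chance the scheme actually helps is essentially zero.
crackpot = expected_benefit(p_catastrophe=1e-4, p_intervention_works=1e-9,
                            value_at_stake=1e12, cost=1e6)

print(asteroid > 0)   # True: worth funding despite the small probability
print(crackpot > 0)   # False: the money is simply lost
```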
It would be pretty weird to argue that human lives decay in utility based on how many there are.
Maybe not as weird as all that. Given a forced choice between killing A and B where I know nothing about them, I flip a coin; but add the knowledge that A is a duplicate of C and B is not a duplicate of anyone, and I choose A quite easily. I conclude from this that I value unique human lives quite a lot more than I value non-unique human lives. As others have pointed out, the number of possible unique human lives is finite, and the number of lives I consider worth living is necessarily even lower, so the more people there are living lives worth living, the less unique any individual is, and therefore the less I value any individual life. (Insofar as my values are consistent, anyway. Which of course they aren’t, but this whole “let’s pretend” game of utility calculation that we enjoy playing depends on treating them as though they were.)
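One toy way to make that preference concrete (purely an illustrative model with an invented weighting, not a claim about how anyone actually values lives): weight each person by the reciprocal of how many copies of them exist, so duplicates share the value of a single unique life and total value is capped by the number of distinct possible people.

```python
from collections import Counter

# Toy uniqueness-weighted valuation (invented for illustration): each person
# contributes 1/(number of copies of that person), so the total equals the
# number of distinct person-types rather than the raw head count.
def population_value(people, value_per_unique_life=1.0):
    copies = Counter(people)  # person-type -> number of copies
    return sum(value_per_unique_life / copies[p] for p in people)

print(population_value(["A", "B", "C"]))        # 3.0: all unique
print(population_value(["A", "A", "A", "B"]))   # 2.0: the three copies of A share one unit
```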