Why have infinite willpower if not to use it to satisfy preferences? Smell the roses, play games, chat with friends. The only reason not to do this on any large scale is if there was something you had to do now that could have huge returns later. Code a self-improving AI, discover immortality, that sort of thing. However, even with infinite willpower I don’t think everyone is cut out for that, so for most people I’d say make enough money to hit diminishing returns on investing it in research, invest it in research, and live the good life.
While you smell the roses, 100 people die horrible, painful deaths. What would you like to do next?
My point is that, until the Singularity (if then), other people's suffering will outweigh the rose-smelling done by people who can control their emotions at will (and by everyone else), even if they have invested in research to the point of zero marginal utility.
Just because people are dying doesn't mean you shouldn't do a cost/benefit calculation. Which sounds terrible, until you notice that (I bet) you haven't donated everything you can to charity either. Now it just sounds human to me.
So is any pleasure considered a luxury until death is eradicated, and is finite willpower the only excuse that can justify not concentrating all one's efforts on fighting death?
Imagine that aging were curable and you were essentially immortal, except for an annual chance of 10^(-8) of dying a painful death in some accident. Would you forgo all trivial pleasures if doing so spared you that risk? In such a world, with a present population on the order of 10^10, 100 people would still die painful deaths each year.
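A quick check of the arithmetic behind that figure:

$$
\text{expected deaths per year} = N \cdot p = 10^{10} \times 10^{-8} = 10^{2} = 100.
$$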
I would try to figure out how painful the deaths are and how trivial the pleasures are.
I don't think this world is an edge case yet, though: decreasing pleasure now seems like it would increase long-term expected pleasure, e.g. by working more and using the money to make FAI more likely, or by giving it to an effective charity.
“Enough to hit diminishing returns” doesn’t mean anything until you specify how strongly diminishing.
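To illustrate with a stock textbook example (not anything specified in the thread): if returns to research funding were logarithmic, the marginal return would diminish everywhere yet never reach zero, so "invest until you hit diminishing returns" picks out no stopping point at all:

$$
u(x) = \log x \quad\Rightarrow\quad u'(x) = \frac{1}{x} > 0 \ \text{ for every } x > 0.
$$

You would need an explicit cutoff, e.g. stop investing when $u'(x)$ falls below the marginal value of spending on yourself.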
Redacted because I misread Manfred’s comment the first time.
After a while of doing that, you will no longer need willpower to prevent you from chatting with friends.