I agree with you to the extent that no one I am aware of is actually expending the effort that disutilities on the order of 10^23 should inspire. But even before the concept of cosmic waste was developed, no one was actually working as hard as, say, starvation in Africa deserved. Or ending aging. Or the threat of nuclear Armageddon. But the fact that humans, who are all affected by akrasia, aren’t actually doing what they want isn’t really strong evidence that it isn’t what they, on sufficient reflection, want. Utility is not a model of what non-rational agents (i.e., humans) are doing; it is a model of how rational, idealized agents want to act. I don’t want people to die, so I should work to reduce existential risk as much as possible, but because I am not a perfect agent, I can’t actually follow the path that really maximizes my (non-existent abstraction of) utility.