It’s not you that’s “losing utility”; it is any agent whose utility is linearly aggregative in human lives lived. If you’re not an altruist in this sense, then you don’t care.
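Concretely, “linearly aggregative utility in human lives lived” can be read as a utility function that simply counts lives. The following is only an illustrative sketch of that phrase, not something any commenter wrote out; the scale constant $c$ is made up for exposition:

$$U(w) = c \sum_{i} \ell_i(w), \qquad \ell_i(w) = \begin{cases} 1 & \text{if person } i \text{ is alive in world } w, \\ 0 & \text{otherwise,} \end{cases}$$

so an outcome that forgoes $10^{23}$ lives costs such an agent $c \cdot 10^{23}$ utility, however remote those lives are.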
No one has ever been an altruist in this crazy sense. No one’s actual wants and desires have ever been adequately represented by this 10^23 stuff. Utility is a model of what people want, not a prescription of what you “should” want (what does “should want” mean anyway?), and here we clearly see the model not modeling what it’s supposed to.
I agree with you to the extent that no one I am aware of is actually expending the effort that disutilities on the order of 10^23 should inspire. But even before the concept of cosmic waste was developed, no one was actually working as hard as, say, starvation in Africa deserved. Or ending aging. Or the threat of nuclear Armageddon. But the fact that humans, who are all affected by akrasia, aren’t actually doing what they want isn’t really strong evidence that it isn’t what they, on sufficient reflection, want. Utility is not a model of what non-rational agents (i.e., humans) are doing; it is a model of how idealized agents would want to act. I don’t want people to die, so I should work to reduce existential risk as much as possible, but because I am not a perfect agent, I can’t actually follow the path that really maximizes my (non-existent abstraction of) utility.
No one’s actual wants and desires have ever been adequately represented by this 10^23 stuff.
Can you expand on this? What do you mean by “actual” wants? If someone claims to be motivated by “10^23 stuff”, and acts in accordance with this claim, then what is your account of their “actual wants”?
I haven’t seen anyone who claims to be motivated by utilities of such magnitude except Eliezer. He’s currently busy writing his Harry Potter fanfic and shows no signs of mental distress that the 10^23-strong anticipation should’ve given him.
From the Author’s Note:
Now this story has a plot, an arc, and a direction, but it does not have a set pace. What it has are chapters that are fun to write. I started writing this story in part because I’d bogged down on a book I was working on (now debogged), and that means my top priority was to have fun writing again.
From Kaj Sotala:
The other reason is that Eliezer Yudkowsky showed up here on Monday, seeking people’s help with the rationality book he’s writing. Previously, he wrote a number of immensely high-quality posts in blog format, with the express purpose of turning them into a book later on. But now that he’s been trying to work on the book, he has noticed that without the constant feedback he got from writing blog posts, getting anything written has been very slow. So he came here to see if having people watching him write and providing feedback at the same time would help. He did get some stuff written, and at the end, asked me if I could come over to his place on Wednesday. (I’m not entirely sure why I in particular was picked, but hey.) On Wednesday, me being there helped him break his previous daily record for the number of words written for his book, so I visited again on Friday and agreed to also come back on Monday and Tuesday.
Eliezer is not “busy writing his Harry Potter fanfic.” He is working on his book on rationality.
The Harry Potter fanfic is a book on rationality. And a damn good one.
To clarify, Eliezer Yudkowsky is working both on a book and on the Harry Potter fanfiction in question. Both pertain to rationality.