Ahhh, I see. I think that’s a bit misleading. I’d say “You have to care about what happens far away,” e.g. you have to want there to be paperclips far away too. (The current phrasing makes it seem like a paperclipper wouldn’t want to do ECL.)
Also, technically, you don’t actually have to care about what happens far away either, if anthropic capture is involved.
I think you have to care about what happens to other agents. That might be “other paperclippers.”
If you only care about what happens to you personally, then I think the size of the universe isn’t relevant to your decision.