Wait, why is ECL lumped under Correlation + Kindness instead of just Correlation? I think this thread is supposed to answer that question but I don’t get it.
It’s not true that you only have an ECL reason to cooperate if you care about the survival of other agents. Paperclippers, for example, have an ECL reason to cooperate.
I think you have to care about what happens to other agents. Those other agents might just be “other paperclippers.”
If you only care about what happens to you personally, then I think the size of the universe isn’t relevant to your decision.
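A minimal sketch of that last point, assuming perfectly correlated agents and a simple additive value model (N, u_self, and u_other are just illustrative labels, not anything from the original post):

\[
\begin{aligned}
\mathrm{EU}(\text{cooperate}) &= u_{\text{self}}(C) + (N - 1)\, u_{\text{other}}(C) \\
\mathrm{EU}(\text{defect})    &= u_{\text{self}}(D) + (N - 1)\, u_{\text{other}}(D)
\end{aligned}
\]

If \(u_{\text{other}} \equiv 0\), i.e. you place no value on what the other \(N - 1\) correlated agents bring about, the \((N - 1)\) terms drop out and the comparison reduces to \(u_{\text{self}}(C)\) vs. \(u_{\text{self}}(D)\), which doesn’t depend on \(N\) or on how big the universe is.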
Ahhh, I see. I think that’s a bit misleading; I’d say “You have to care about what happens far away,” e.g. the paperclipper has to want there to be paperclips far away too. (The current phrasing makes it sound like a paperclipper wouldn’t want to do ECL.)
Also, technically, you don’t actually have to care about what happens far away either, if anthropic capture is involved.