Doesn’t this depend on what we value?
In particular, you appear to assume that we care about events outside of our lightcone in roughly the way we care about events in our near future. I’m guessing a good deal of skepticism of ECL is a result of people not caring much about distant events.
Yeah, you’re right that we assume you care about what’s going on outside the lightcone! If that’s not the case (or is only true to a small degree), that would limit the action-relevance of ECL.
(That said, there might be some weird simulation shenanigans, or cooperation with a future earth-AI, that would still make you care about ECL to some extent, although my best guess is that they shouldn’t move you too much. This isn’t really my focus, though, and I haven’t properly thought through ECL for people with indexical values.)