I’m not sure. I think even if the strong claim here is wrong and realism (about other minds) is coherent, whether other minds exist is still fundamentally unknowable, and we can’t get any evidence at all in its favor. That might be enough to doom altruism.
It’s hard for me to reason well about a concept I believe to be incoherent, though.
AFAIU, under strong(er?) verificationism, it’s also incoherent to say that your past and future selves exist. So all goals are doomed, not just altruistic ones.
Alternatively, maybe if you merge all the minds, then you can verify that other minds exist and take care of them. Plus, maybe different parts of your brain communicating with each other aren’t qualitatively different from different brains communicating with each other (although they probably are).
I haven’t written specifically about goals, but given that claims about future experiences are coherent, preferences over the distribution of such experiences are coherent too, and one can act on one’s beliefs about how one’s actions affect that distribution. This doesn’t require the past to exist.
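A minimal decision-theoretic sketch of that point (my own formalization, not anything claimed above): let $A$ be the available actions, $P(e \mid a)$ one's credence that taking action $a \in A$ leads to future experience $e$, and $u(e)$ one's preference over experiences. Acting on those beliefs then just means choosing

$$a^* \in \arg\max_{a \in A} \sum_e P(e \mid a)\, u(e),$$

which refers only to credences about future experiences, never to the existence of past ones.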