the view that there’s probably no persisting identity over time anyway and in some sense I probably die and get reborn all the time in any case
In the long run, this is probably true for humans in a strong sense, one that doesn't depend on litigating "personal identity" or "all the time". A related phenomenon is value drift. Neural nets are not a safe medium for keeping a person alive for a very long time without their losing themselves; physical immortality is insufficient to solve the problem.
That doesn't mean the problem isn't worth solving, or that it can't be solved. If AIs don't kill everyone, immortality or uploading is an obvious ask. Avoiding value drift, or unendorsed long-term instability of one's personality, is a less obvious one. It's unclear what the desirable changes should be, but it's clear that there is an important problem here that hasn't been explored.
What if endorsed long-term instability leads to negation of personal identity too? (That's something I've thought about.)