and more specifically, you should not find yourself personally living in a universe where the history of your experience is lost. I say this because it is evidence that we will likely avoid an AI alignment failure that destroys us, or at least that we will not find ourselves in a universe where AI destroys us all, because alignment will turn out to be easier in practice than we expect it to be in theory.
Can you elaborate on this idea? What do you mean by ‘the history of your experience is lost’? Can you supply some links to read on this whole theory?
Basically, it's one way of understanding the combined consequences of https://en.wikipedia.org/wiki/Anthropic_principle and https://en.wikipedia.org/wiki/Many-worlds_interpretation.