I read that earlier, and it doesn’t answer the question. If you believe that the second copy in your scenario is different from the first copy in some deep existential sense at the time of division (equivalently, that personhood corresponds to something other than unique brain state), you’ve already assumed a conclusion to all questions along these lines—and in fact gone past all questions of risk of death and into certainty.
But you haven’t provided any reasoning for that belief: you’ve just outlined the consequences of it from several different angles.
Here’s why I conclude a risk exists: http://lesswrong.com/lw/b9/welcome_to_less_wrong/5huo?context=1#5huo