Within this framework, whether or not you “feel that continuity” would mostly be a fact about the ontology your mindstate uses when thinking about teleportation. Everything in this post could be accurate, and none of it would be incompatible with you having an existential crisis upon being teleported, freaking out upon meeting yourself, and so on.
Nor does anything here make a value judgement about what the copy of you should do if told they’re not allowed to exist. Attempting revolution seems like a perfectly valid response; self-defense is held as a fairly basic human right, after all. (I’m shocked that isn’t already the plot of a sci-fi story.)
It would also be entirely possible for both of your copies to hold the conviction that they’re the one true you, their experiences from where they sit being entirely compatible with that belief. (Definitely the plot of at least one Star Trek episode.)
There’s currently no real pressure for our thinking about mind copying to be consistent with every piece of technology that could ever conceivably be built. Nothing forces minds to have accurate beliefs about anything that won’t kill them, or wouldn’t have killed their ancestors, in fairly short order. Which is to say, we mostly shouldn’t expect to arrive at accurate beliefs about weird hypotheticals without having changed our minds at least once.