But this is a slippery slope. If my recreation is exactly like me except for one neuron, is he the same person? Signs point to yes. What about five neurons? Five million? Or on a functional level, what if he blinked at exactly one point where I would not have done so? What if he prefers a different flavor of ice cream? What if he has exactly the same memories as I do, except for the outcome of one first-grade spelling bee I haven’t thought about in years anyway? What if he is a Hindu fundamentalist?
These questions apply equally to the person who wakes up as Yvain tomorrow. Are you still “you” after losing a day’s worth of neurons in the ordinary course of things? After a minor stroke? After brain surgery? After a religious conversion? After changing in any way at all, including by reading this comment?
I’ve never seen the concept of degrees of identity formalised in a modal logic of possible worlds. The modal logics I’ve seen all consider each entity to either exist, or not exist, in each possible world. One can make toy mathematical systems out of this, but it is less clear what practical use they would be.
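One way to see what a graded notion might even look like: replace the binary “same entity / not the same entity” judgement with a similarity score between versions of a person across worlds. A minimal toy sketch in Python, where the property sets and the choice of Jaccard similarity are my own invented stand-ins, not any standard formalism:

```python
def identity_degree(person_a, person_b):
    """Degree of identity as Jaccard similarity of property sets:
    1.0 means indistinguishable, 0.0 means nothing in common."""
    if not person_a and not person_b:
        return 1.0
    shared = person_a & person_b   # properties both versions hold
    total = person_a | person_b    # all properties held by either
    return len(shared) / len(total)

# Hypothetical property sets for two "worlds" of the same person.
yvain_today = {"spelling_bee_memory", "prefers_chocolate", "atheist"}
yvain_tomorrow = {"spelling_bee_memory", "prefers_chocolate",
                  "hindu_fundamentalist"}

print(identity_degree(yvain_today, yvain_today))     # → 1.0
print(identity_degree(yvain_today, yvain_tomorrow))  # → 0.5
```

The point of the toy is only that identity comes out as a number on a continuum rather than a yes/no per world, which is exactly the gradation the one-neuron and five-million-neuron cases seem to demand; which properties should count, and how to weight them, is where all the real philosophical work would be.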