The fact that you considered that parent/kid question to be a valid argument indicates strongly to me that you don’t have the mindset or understanding to make copies safely.
IMHO, you shouldn’t be allowed to make copies of yourself unless you’re willing to suicide and let it take your place. People unable to do that lack the mindset to properly manage copy creation and destruction.
How does it not follow from what you said?
Sexual reproduction is a form of reproduction. Anyone who is a parent knows that children are a limited means of carrying identity, in the form of drives, goals, likes & dislikes, etc., into the future, even if vicariously (both because of your influence on them and their influence on you). If inputs/outputs are all that matter in determining identity, then identity is a fuzzy concept on a continuous scale, as we are all constantly changing. Your children carry on some part of your personal identity, even if in nothing but their internal simulations of you. The same arguments apply.
If we’re going to talk about societal proscriptions, then I would say those who think their sentient creations should be prepared to commit suicide for any reason are the ones who shouldn’t be dabbling in creation...
Yes, sexual reproduction is a form of reproduction, one which we were explicitly not talking about. We were talking about perfect copies.
You may continue beating at the straw man if you wish, but don’t expect me to respond.
There is no such thing as a perfect copy. That’s what the OP is about! Even if there were some sort of magical philosophy box that made perfect replicas, you would cease to be perfect copies of each other as soon as you exited the box and started receiving different percepts; you would become different physical sentient entities leading separate lives. If you want to believe that these two clones are in fact the same identity, then you have to provide a specific reason, for example: related histories, similarity of behavior, motivation & drives, etc. Furthermore, it would have to be a fuzzy comparison, because as soon as you exit the box you start to diverge. How much change does it take until you can no longer claim that you and your clone are the same person? A week? A year? One hundred years? At that point you and your clone will have spent more time apart than your entire shared history. Do you still have the right to claim the other as a direct extension of yourself? What if a million years pass? I am quite confident that in a million years, you will have less in common with your clone than you currently do with your own children (assuming you have children).
So no, it’s not a straw man. It’s a direct conclusion from where your reasoning leads. And when a line of reasoning leads to absurd outcomes, it’s often time to revisit the underlying assumptions.
This looks like an argument for extreme time preference, not an argument against copies. Why identify with one of the million-years-later versions of yourself and exclude the other, unless we beg the question?
That’s what I’m saying. I myself wouldn’t identify with any of the copies, no matter how near or distant. My clone and I have a lot in common, but we are separate sentient beings (hence: requesting suicide of the other is tantamount to murder). But if you do identify with clones (as in: they are you, not merely other beings that are similar to you), then at some point you and they must cross a line of divergence beyond which they are no longer identifiable with you, or else the argument reduces to absurdity. Where is that line? I see no non-arbitrary way of defining it.
EDIT: This is what led me to suspect that, other than intuition, I have no reason to think that my clone and I share the same identity, which in turn led me to consider other models for consciousness and identity. My terseness isn’t just because of the moral repugnance of asking others to commit suicide, but also because this is an old, already-hashed-out argument. I first encountered it in philosophy class 10+ years ago. If there is a formal response to the reduction to absurdity I gave (one that doesn’t also throw out consciousness entirely), I have yet to see it.
We certainly don’t need a bright line.
Maybe you already got this part, but time preference is orthogonal to copies vs originals.
Eliezer says he defines personal identity in part by causal connections, which exist between you and the “clone” as well as between you and your “original” in the future. This definition also suggests a hole in your argument for strong time preference.
You are misreading me. I don’t have time preference. If an exact perfect replica of me were made, it would not be me even at the moment of duplication.
I have continuation-of-computation preference. This is much stricter than Eliezer’s causal-connection-based identity, but it also avoids many weird predictions that arise from that definition.
And yes, you would need a bright line in this case. The fuzziness is in the map, not the territory, on this item.