The brain constructed in your likeness is only normatively related to your brain. That’s the point I’m making. The step where you make a description of the brain is done according to a practice of representation. There is no causal relationship between the initial brain and the created brain. (Or, rather, any causal relationship is massively dispersed through human society and history.) It’s a human being, or perhaps a computer programmed by human beings, in a cultural context with certain practices of representation, that creates the brain according to a set of rules.
This is obvious when you consider how the procedure might be developed. We would have to have a great many trial runs and would decide when we had got it right. That decision would be based on a set of normative criteria, a set of measurements. So it would only be “successful” according to a set of human norms. The procedure would be a cultural practice rather than a physical process. But there is just no such thing as something physical being “converted” or “transformed” into a description (or information, or a pattern, or a representation), because these are all normative concepts, and so such a step cannot possibly conserve identity.
As I said, the only way the person in cryonic suspension can continue to live is through a standard process of revival—that is, one that doesn’t involve the step of being described and then having a likeness created—and if such a revival doesn’t occur, the person is dead. This is because the process of being described and then having a likeness created isn’t any sort of revival at all and couldn’t possibly be. It’s a logical impossibility.
My response to this is very simple, but it’s necessary to know beforehand that the brain’s operation is robust to many low-level variations, e.g., thermal noise that triggers occasional random action potentials at a low rate.
“We would have to have a great many trial runs and would decide when we had got it right.”
Suppose our standard is that we get it right when the reconstructed brain is more like the original brain just before cryonic preservation than a brain after a good night’s sleep is like that same brain before sleeping—within the subset of brain features that are not robust to variation. Further suppose that that standard is achieved through a process that involves a representation of the structure of the brain. Although the representation is indeed a “cultural practice”, the brute fact of the extreme degree of similarity of the pre- and post-process brains would seem much more relevant to the question of preservation of any aspect of the brain worthy of being called “identity”.
ETA: Thinking about this a bit more, I see that the notion of “similarity” in the above argument is also vulnerable to the charge of being a mere cultural practice. So let me clarify that the kind of similarity I have in mind basically maps to reproducibility of the input-output relation of a low-level functional unit, up to, say, the magnitude of thermal noise. Reproducibility in this sense has empirical content; it is not merely culturally constructed.
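To make the reproducibility criterion concrete, here is a minimal sketch in Python. All the specifics are assumptions for illustration: the “functional unit” is a toy weighted-sum-plus-tanh neuron (a stand-in for whatever low-level unit one cares about), the measurement error and noise floor are made-up magnitudes, and `reproducible` is a hypothetical test, not a proposed real procedure. The point is only that “same input-output relation up to the noise floor” is an empirical check, not a cultural convention.

```python
import math
import random

def unit_output(weights, bias, inputs):
    """Toy functional unit: weighted sum passed through tanh,
    standing in for a neuron's input-output relation."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return math.tanh(s)

def reproducible(original, reconstruction, input_samples, noise_floor):
    """Empirical similarity test: the reconstruction reproduces the
    original's input-output relation if, over the sampled inputs,
    outputs never diverge by more than the noise floor."""
    return all(
        abs(original(x) - reconstruction(x)) <= noise_floor
        for x in input_samples
    )

random.seed(0)
weights = [0.8, -0.5, 0.3]
bias = 0.1
# "Scanning" introduces small measurement error, well below the
# (assumed) thermal-noise floor of 1e-2.
measured = [w + random.gauss(0, 1e-4) for w in weights]

orig = lambda x: unit_output(weights, bias, x)
recon = lambda x: unit_output(measured, bias, x)

samples = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(1000)]
print(reproducible(orig, recon, samples, noise_floor=1e-2))  # True
```

Whether the reconstruction passes this test does not depend on anyone’s norms of representation; it depends on the measured deviations, which is the sense in which reproducibility has empirical content.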
I don’t see how using more detailed measurements makes it any less a cultural practice. There isn’t a limit you can pass where doing something according to a standard suddenly becomes a physical relationship. Regardless, consider that you could create as many copies to that standard as you wished, so you now have a one-to-many relationship of “identity” according to your scenario. Such a type-token relationship is typical of norm-based standards (such as media of representation) precisely because they are norm-based: you can make as many items according to the standard as you wish.
“I don’t see how using more detailed measurements makes it any less a cultural practice.”
I’m not saying it’s not a cultural practice. I’m saying that the brute fact of the extreme degree of similarity (and resulting reproducibility of functionality) of the pre- and post-process brains seems like a much more relevant fact. I don’t know why I should care that the process is a cultural artifact if the pre- and post-process brains are so similar that for all possible inputs, they produce the same outputs. That I can get more brains out than I put in is a feature, not a bug, even though it makes the concept of a singular identity obsolete.