The immediate benefit of helping others now seems to considerably outweigh the selfish value of self-preservation.
Actually, I believe there is an interesting case to be made that brain preservation has immense public goods value.
The actual process of future resurrection—if possible—will revolve around statistical inference; it will necessarily involve a large amount of informed simulation/induction on the part of future AI.
The human cortex contains a model of the universe from the perspective of one observer, and other humans/agents are the most complex objects our brains must model. So the key information content of one particular human mind is not localized to a particular brain—it is instead distributed across many brains.
I’m not sure I understand. What do you mean when you say this:

"The human cortex contains a model of the universe from the perspective of one observer, and other humans/agents are the most complex objects our brains must model. So the key information content of one particular human mind is not localized to a particular brain—it is instead distributed across many brains."

Are you saying that the universe is part of all minds, not just one particular person’s? Can you explain what you mean more clearly?
I mean that the physical information which defines—or alternatively is required to reconstruct—a human mind is not strictly localized in space to the confines of a single brain.
Using the hardware/software analogy, the brain is the hardware, the mind is the software, but the mind is distributed software: each mind program runs mainly on a single brain, but it also has partial cached copies distributed on other brains.
For example, if two people spend a bunch of time together, they are going to have many shared memories. Later if both die and the brain of one is preserved, the shared memories are useful for constructing both minds. With many preserved brains, you get multiple viewpoints for many overlapping memories which allow for more precise reconstruction.
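The intuition that overlapping copies allow more precise reconstruction can be illustrated with a toy model (purely illustrative, not a claim about how actual reconstruction would work: a "memory" here is just a vector of numbers, and each brain holds an independently noisy copy of it):

```python
import random

random.seed(0)

# Toy model: a "memory" is a list of numbers. Each observer (brain) stores
# an independently corrupted copy of the same underlying memory.
true_memory = [float(i) for i in range(10)]

def noisy_copy(memory, noise):
    """Return a copy corrupted by independent zero-mean Gaussian noise."""
    return [x + random.gauss(0, noise) for x in memory]

def reconstruct(copies):
    """Estimate the original memory by averaging all available copies."""
    n = len(copies)
    return [sum(vals) / n for vals in zip(*copies)]

def error(estimate, truth):
    """Root-mean-square error of a reconstruction."""
    return (sum((e - t) ** 2 for e, t in zip(estimate, truth)) / len(truth)) ** 0.5

one_copy = [noisy_copy(true_memory, 1.0)]
many_copies = [noisy_copy(true_memory, 1.0) for _ in range(25)]

# Averaging N independent noisy copies shrinks the error roughly as
# 1/sqrt(N), so 25 overlapping viewpoints reconstruct the memory far
# more precisely than any single one.
print(error(reconstruct(one_copy), true_memory))
print(error(reconstruct(many_copies), true_memory))
```

This is just the standard statistics of combining independent noisy measurements; the point is only that information lost from one copy can be partially recovered from the others.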
I’m a little disturbed by the thought of reconstructing my personality from others’ impressions of my personality.