I find that I often get hung up on background assumptions that go into human capital creation in these hypothetical copy-of-a-person scenarios.
One model here is that I should anticipate with near certainty having a few random thoughts, then ending up in Hawaii for a while and then ceasing to exist(!), and neither forming new memories nor reminiscing about previous events. In other words I should anticipate imminent metaphysically imposed death right after an ironically pleasant and existential-fear-tinged vacation.
The reason is that CPU time will still cost something, and presumably the version of me that has to pay for those cycles will not want to have 1000 dependents (or, worse still, 1000 people with legitimate claims to 1/1001st of my upload self’s wealth) draining her CPU budget.
Another model is that I should “anticipate remembering” being cold for a while, then running a bunch of closed-ended sims whose internal contents barely mattered at all to the rest of the world. Then I’d shut them down, feel the oath to be fulfilled, and wonder what the point was exactly. Then for the next 1000 subjective years I would get to reminisce about how naive I was about how resource constraints, subjective experiences, and memories of subjective experiences interrelate.
Those scenarios paper over a lot of details. Are the sims deterministic with identical seeds? If so, I think my answer is roughly a 50/50 expectation, with 999 of the sims being a total waste of joules.
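As a toy illustration of that determinism point (a minimal sketch; `run_sim` and its parameters are hypothetical stand-ins for whatever the sims actually compute): if every copy gets an identical seed and identical inputs, all the runs produce bit-identical experience streams, so the 999 extra copies buy no new information at all.

```python
import random

def run_sim(seed: int, steps: int = 5) -> list[float]:
    """Hypothetical stand-in for one deterministic sim: same seed, same trajectory."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(steps)]

# 1000 copies, all seeded identically...
runs = [run_sim(seed=42) for _ in range(1000)]

# ...yield identical "experience streams"; 999 of them are redundant joules.
assert all(r == runs[0] for r in runs)
print("distinct trajectories:", len({tuple(r) for r in runs}))  # -> 1
```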
Are the sims going to be integrated with my future self’s memory somehow? That seems like it would be a big deal to my “real self” and involve making her dumber (mentally averaged with either 1 or 1000 people like me who lack many of her recent and coolest memories).
And so on.
The key theme here is that “human capital” is normally assumed to be “a thing”. These thought experiments break that assumption, and that matters. Souls (rooted in bodies, or in the future just in the cloud) are more or less cultivated. Creating new souls has historically been very costly and has involved economic consequences. The game has real stakes, and the consequences of things like brain surgery resonate for a long time afterwards. If something doesn’t have “consequential resonance” then it is in a sort of “pocket universe” that can be safely discounted from the perspective of the causal domains, like ours, which are much vaster but also relatively tightly constrained.
Possibly the scariest upload/sim scenario is that the available clock cycles are functionally infinite, and you can spawn a trillion new selves every subjective second, for centuries, and it’s all just a drop in the bucket. None of the souls matter. All of them are essentially expendable. They re-merge and have various diff-resolution problems, but it doesn’t matter because there are a trillion trillion backups that are mostly not broken, and it’s not like you can’t just press the “human virtue” button to be randomly overwritten with a procedurally generated high-quality soul. Whenever you meet someone else they have their own unique 1-in-10^50 sort of brokenness, and it doesn’t even seem weird because that’s how nearly everyone is. This is the “human capital is no longer a thing” scenario, and I think (but am not absolutely sure) that it would be a dystopian outcome.