Hmm. What I had in mind isn’t something I would call brain surgery. It would be closer to a drug. My idea (pretty much an “outlook” from Egan’s Diaspora) is that your mind would be running in software, in a huge neuron simulator, and that the tech would simply inhibit the output of certain targeted networks in your brain or enhance others. This would obviously be much more targeted than drugs could achieve. (I guess you might be able to achieve this in a physical brain with nanotech.)
I’m not sure if this changes your intuition any. Perhaps you would still be uncomfortable with it without understanding it first. But if you trust the people who would be reviving you to not torture and enslave you, you could conceivably leave enough detailed information about your preferences for you to trust them as a first-cut proxy on the mind modification decision. (Though that could easily be infeasible.) Or perhaps you could instruct them to extrapolate from your brain whether you would eventually approve of the modification, if the extrapolation wouldn’t create a sentient copy of you. (I’m not sure if that’s possible, but it might be.)
I trust the inhabitants of the future not to torture and enslave me. I don’t trust them not to be well-intentioned evil utilitarians who think nothing of overriding my instructions and preferences if that will make me happy. So I’d like to have the resources to be happy without anybody having to be evil to me.
But that wouldn’t be making you happy. It’d be making someone very much like you happy, but someone you would never have matured into. (You may still prefer that this latter person not be created, or not want to pay for cryonics just so that they can be created; but that’s not the point.) I doubt that people in the future will have so much disregard for personal identity and autonomy that they would make such modifications to you. Do you think they would prevent someone from committing suicide? If they would make unwanted modifications to you before reviving you, why wouldn’t they be willing to make modifications to unconsenting living people*? They would see your “do not revive unless...” instructions as a suicide note.
* Perhaps because they view you as a lower life form for which more paternalism is warranted than for a normal transhuman.
Of course that’s not a strong argument. If you want to be that cautious, you can.
I don’t. I wouldn’t be very surprised to wake up modified in some popular way. I’m protecting the bits of me that I especially want safe.
Maybe.
Who says they’re not? (Or: Maybe living people are easier to convince.)