In this case, the best policy seems to be cryopreserving people, letting them stay dead, but extracting those experiences and inserting them into new minds.
Which sounds weird when you put it like that, but it is functionally equivalent to many of the scenarios you would intuitively expect and find good, such as radically improving minds and linking them into larger ones before waking them up. Anything else would leave them unable to meaningfully interact with anything anyway, and human-level minds are unlikely to qualify for informed consent.