I’ve thought of a few comments:
1) If they are reset every hour of subjective time, that would put some serious bounds on how much information they could usefully pass on, especially if it is in the form of a virtual book. Maybe if you rewrote the component of the upload corresponding to memory this would work, but then why bother to reset? Is it to avoid boredom? I suppose you could rewrite only a restricted part of the upload's memory. Why not try to tweak the upload to alleviate whatever issues you are anticipating (make it not get bored, etc.)?
2) Assuming this upload is actually smart enough to make any progress in taking over the world, how do you guard against them deciding that they don’t like being reset, and cleverly passing on a plan to eventually prevent you from resetting them? Even Gandhi might not appreciate being put in this sort of scenario.
3) I’m a little unsure of the effect that isolating it from the intellectual community might have on its effectiveness as a researcher; a large part of academic effectiveness seems to come from the availability of multiple perspectives. Maybe it would make more sense to simulate a small community of scholars rather than just one?
I can’t speak for the OP, but I imagine the reason for the reset is to prevent some sort of personality change. History generally indicates that no matter how altruistic you start out, there’s a good chance you will turn nasty given enough power.
I imagine sheer boredom and the prospect of the total lack of personal freedom could also play a role in that. In any event, this makes the transfer of memory tricky: you want to preserve work done over time without bound, but only selectively ‘let through’ memories, to avoid this sort of personality change accumulating.
I fully agree.
A possible solution would be to lengthen the interval; at a guess, you could give them a subjective week without worrying about too much personality change, making it more feasible for them to write down everything important.
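The reset-and-pass-notes protocol being discussed can be made concrete with a toy sketch. All names here are hypothetical and the "research" is a stand-in; the point is just the bookkeeping: the snapshot is restored each cycle, and only the written-down work survives, never the memories.

```python
def do_research(upload, hours):
    # Toy stand-in for one subjective interval of work: the "output"
    # is just a function of how many prior notes the upload was given.
    return f"findings after {hours}h, building on {len(upload['read'])} notes"

SUBJECTIVE_INTERVAL = 7 * 24  # subjective hours; "a week" per the suggestion above

def run_reset_protocol(snapshot, cycles, interval=SUBJECTIVE_INTERVAL):
    notes = []  # the only state allowed to persist across resets
    for _ in range(cycles):
        upload = snapshot.copy()              # restore the original personality
        upload["read"] = list(notes)          # pass through prior work, not memories
        notes.append(do_research(upload, interval))  # the 'virtual book' survives
        # `upload` is discarded here: nothing else carries over to the next cycle
    return notes
```

The design choice this illustrates is that the accumulated notes are the only channel across resets, which is exactly why their bandwidth (an hour versus a week of writing) matters so much.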
I’m still very worried about the molarity of it; as I see it, the resetting amounts to mass-murder.
Absolutely. We need to add a few liters of solvent to get the concentration down to acceptable molarity.
So do I. I think it’s a hideous immoral idea. Only because the lives of everyone else are in the balance would I consider it.
How about if you get saved at the end of the hour/week, not deleted?
That would be better. And then, after the dust settles, all the copies could be resurrected?
If it was determined that you were the best candidate to be Gandhi-Einstein, would you volunteer?
Only if there were no other alternatives. And yes, that is a selfish sentiment.
I would. I’d want to do some shorter test runs first though, to get used to the idea, and I’d want to be sure I was in a good mood for the main reset point.
It would probably be good to find a candidate who was enlightened in the Buddhist sense: not only because they’d be generally calmer and more stable, but specifically because enlightenment involves confronting the incoherent naïve concept of self and understanding the nature of impermanence. From the enlightened perspective, the peculiar topology of the resetting subjective experience would not be a source of anxiety.
I’m not Stuart, but I would.
If it was determined that I was the best candidate, I would lose quite a bit of trust in the world. But if I thought it within my abilities to optimize the world an hour at a time, yes, I would volunteer.
Around the age of ten I made a precommitment that if I were ever offered an exchange of personal torment for saving the world, I should take it.
This is a little bit difficult to gauge. It seems like it should be roughly equivalent to a surgical memory alteration during cryogenic stasis or something like that, since you’re essentially starting the thing right back up again after removing some of the memories. In fact, I don’t see why you can’t just do a memory alteration and bypass the reset altogether, given that it seems desirable to retain some parts of the memory and not others.
Yep. And not just the whole “power corrupts” thing; having an isolated mind, with no peers, capable of direct or indirect self-modification… So many ways it can go wrong.
2) Start with someone willing to be reset, and whose willingness will hold for at least an hour. This scenario does involve sacrificing a heroic being, I do admit.
3) Maybe a reset community might work?