I think the whole point of a guardian angel AI only really makes sense if it isn’t an offshoot of the central AGI. After all, if you distrusted the singleton enough to want a guardian angel AI, then you would want it to be as independent from the singleton as is allowed. Whereas if you do trust the singleton AI (because, say, you grew up after the singularity), then I don’t really see the point of a guardian angel AI.
>I think there would be levels, and most people would want to stay at a pretty normal level and would move to more extreme levels slowly before deciding on some place to stay.
I also disagree with this, insofar as I don’t think that people “deciding on some place to stay” is a stable state of affairs under an aligned superintelligence, since I don’t think people will want to be loop immortals if they know that’s where they are heading. Similarly, I don’t know if I would even consider an AGI aligned if it didn’t try to ensure people understood the danger of becoming a loop immortal and try to nudge them away from it.
Though I really want to see some surveys of normal people to confirm my suspicions that most people find the idea of being an infinitely repeating loop immortal existentially horrifying.