An unfriendly AI would probably just kill us. An unfriendly em? A human wrote The 120 Days of Sodom.
Well. That’s not an existential risk, but it would be bad if we had a sadistic upload in charge. But I think that if we had enough knowledge of neuroscience to create WBE, then we should be able to eliminate the pathologies of the mind that create deranged lunatics, sadists, and psychopaths. Who would want to be like that anyway, when the alternative is to live in a digitally created state of bliss? You could still be part of what you consider to be “reality”, so that you wouldn’t feel bad if you were in a “fake” virtual reality.
First, I believe the creation of em-hell is worse than human extinction. Second, I have no idea how neurologically different sociopaths are.
Who would want to be like that anyway, when the alternative is to live in a digitally created state of bliss?
At least part of me prefers friendship with real people to being wireheaded into pure bliss. Another part of me values winning zero-sum games against real people. Although no part of me values torturing innocent people, I can certainly understand why a monster-em would prefer ruling over em-hell to being wireheaded.
Our monster-em could perhaps endorse this:
http://lesswrong.com/lw/lb/not_for_the_sake_of_happiness_alone/
I agree with the point about em-hell. But I don’t think it’s very likely, because I think you could screen against sociopaths being uploaded.
The full quote:
Who would want to be like that anyway, when the alternative is to live in a digitally created state of bliss? You could still be part of what you consider to be “reality”, so that you wouldn’t feel bad if you were in a “fake” virtual reality.
You could still have real friends. What zero-sum games do you have in mind? Surely anything you find enjoyable now pales in comparison to what’s possible and likely with WBE?
So, our monster-em would be an existential risk IF all these conditions are met:
it gets through a screening process.
it prefers staying psychopathic to enjoying em-bliss.
it somehow gains power to torture not only a few ems, but all the ems.
it prefers torturing real people/ems to torturing highly realistic but non-conscious video-game simulations.
No other ems are able/willing to stop it.
Doesn’t that seem unlikely?
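To make the conjunction explicit, here is a minimal sketch in Python. Every probability below is a made-up placeholder of my own, not an estimate anyone in this thread has defended; the only point is that multiplying several independent conditions together shrinks the result fast.

```python
# Illustrative only: each probability is an invented placeholder,
# one per condition in the list above.
p_passes_screening = 0.05     # fools the upload screening process
p_stays_psychopathic = 0.5    # declines em-bliss, keeps its sadism
p_gains_total_power = 0.01    # can torture all ems, not just a few
p_prefers_real_victims = 0.5  # wants conscious victims, not simulations
p_unstoppable = 0.1           # no other ems able/willing to stop it

# Treating the conditions as independent (a simplifying assumption):
p_em_hell = (p_passes_screening * p_stays_psychopathic
             * p_gains_total_power * p_prefers_real_victims
             * p_unstoppable)
print(f"P(em-hell) = {p_em_hell:.2e}")  # conjunctions compound quickly
```

With these placeholder numbers the conjunction comes out around one in eighty thousand; the independence assumption is doing real work here, since a monster-em capable of fooling the screening is plausibly also better at gaining power.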
First, I value friendships with real people. If I were an em, I would probably value friendships with real ems and not unconscious simulations.
If I valued torturing real people, the em-version of me would probably value torturing real ems, not unconscious simulations.
Not all humans have pleasure as their summum bonum. Not all humans would want em-bliss.
Second, you do not know what this screening process would entail, or whether it would be possible to fool it. You also do not know how unfriendly neurotypical ems could become.
Third, yes. Em-hell is unlikely. However, if it is possible for a monster-em to arise and gain power, Pascalian reasoning begins to apply.
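The Pascalian point is an expected-value claim, and a toy calculation makes the shape of it visible. Both magnitudes below are placeholders I am inventing for illustration; nothing in the thread pins down either number.

```python
# Toy Pascalian expected-harm calculation with invented magnitudes:
# a deliberately tiny probability against a deliberately vast harm.
p_em_hell = 1e-5      # placeholder: chance the full conjunction holds
harm_em_hell = 1e15   # placeholder: disutility if em-hell occurs

expected_harm = p_em_hell * harm_em_hell
print(f"expected harm: {expected_harm:.2e}")
```

If the harm term is allowed to grow without bound, it swamps any small probability, which is exactly why Pascalian reasoning is treated with suspicion: the conclusion is driven by the placeholder you pick for the worst case, not by evidence.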
Fourth, em-hell and em-heaven are both fantasies. Other hypothetical futures probably deserve more focus.