The problem with that approach is: how would you know that such a being is actually you? And wouldn’t sentiment like that encourage “Shoggoth” to optimise the biological people away by convincing them all to “go digital”?
I would prefer having a separate mortal meat me and an “immortal soul” digital me. That way we could live and learn together until the mortal me eventually dies.
Gradual uploading. If it values continuity of consciousness—and it should—it would determine a guaranteed way to protect that during the upload process.
And wouldn’t sentiment like that encourage “Shoggoth” to optimise the biological people away by convincing them all to “go digital”?
Yes. That’s exactly what it ought to do. Of course, perhaps it doesn’t need to; the market will do that by itself. Digital space will be far cheaper than physical. (For reference, in my vision of utopia, there would be a non-capitalist market, without usury, rent, etc. Doing things other people like buys you more matter and energy to use. Existing purely digitally would be so cheap that only tremendously wealthy people would be physical, and I’m not sure that in a sane market it would be possible for an entity with a merely human degree of intelligence to become that wealthy. Superintelligences below the world-sovereign might, but they also would probably use their allocated matter efficiently.)
To me that looks like a universe made of computronium and devoid of living humans, the only difference from the unaligned Foom being that some of that computronium computes our digital imitations.
EDIT: I don’t claim that the “me is meat me” view is objectively right. It’s just that, according to my purely subjective values, people are biological people and I am a biological me. Digital beings can be our children and successors, but I don’t identify myself with them.
You may view digital you as your true self. I respect that. But I really don’t want an AI that forces your values on me (or mine on yours), or an AI that makes people compete with AIs for the right to be alive, because it’s obvious that we have no chance in that competition. If we have an AI that maximizes intelligence, is it really that different from a “paperclip optimizer” that “can find a better use for your atoms”?