Wouldn’t it be rational to assume that whatever or whoever designed the simulation would do so for the same reasons that we know all intelligent life conforms to: survival/reproduction and maximizing its pleasure / minimizing its pain?
I see two problems with this:
Alien minds are alien, and
that really doesn’t seem to exhaust the motives of intelligent life. It would seem to recommend wireheading to us.
If alien means “not comprehensible” (not even through our best imagination), then it’s folly to talk about such a thing. If we cannot even imagine something being realistically possible, then for all practical purposes (until objectively shown otherwise) it isn’t.
Or, using modal logic: possibly possible = not realistically possible; physically/logically possible = realistically possible. The latter always carries more weight and, by Occam, a higher probability (a higher chance of being correct, of being closer to the truth).
If we imagine the designer is not acting irrationally or randomly, then all potential motives reduce to survival/reproduction and max. p/p. The notion of max. p/p is directly tied to the organism’s stage of intelligence and self-awareness, but survival/reproduction is hardwired into all the evolutionary types of life we know.
By “alien” I really did just mean “different”. There are comprehensible possible minds that are nothing like ours.
If we imagine the designer is not acting irrationally or randomly, then all potential motives reduce to survival/reproduction and max. p/p.
I don’t think this is true. Imagine Omega comes to you and says, “Look, I can cure death—nobody will ever die ever again, and the only price you have to pay for this is a) you can never have children, and b) your memory will be wiped, and you will be continuously misled, so that you still think people are dying. To you, the world won’t look any different. Will you take this deal?” I don’t think it would be acting randomly or irrationally to take that deal—big, big gain for relatively little cost, even though your (personal) survival and reproduction and (personal) max. p/p. aren’t affected by it. Humans have complicated values—there are lots of things that motivate us. There’s no reason to assume that the simulation-makers would be simpler.