I’m not sure I believe your first clause. The final chapter of The Metamorphosis of Prime Intellect proposed an almost Buddhist style of resurrection as a solution to the problem of fun: if the universe starts to feel too much like a game to some transhumans, I suspect the desire to live again as a human for a single lifetime might be fairly common. Does that desire override the suffering created for the new human consciousness, which will later be merged back into the immortal transhuman? Most current humans do seem to value suffering, for reasons I don’t yet understand...
Since this is now perilously close to an argument about CEV, we can probably leave that as a rhetorical question. For what it’s worth, I revised my intuitive, qualitative probability that I’m living in a simulation somewhat downward on the strength of your statement that your friendly AI, as you currently conceive of it, wouldn’t have let me reincarnate myself into my current life.
The masochists I know seem to value suffering either for interpersonal reasons (as a demonstration of control; beyond that I’m insufficiently informed to speculate) or to establish a baseline against which pleasurable experiences feel more meaningful.