One of my difficulties with this is that it seems to contradict one of my core moral intuitions: that suffering is bad. It seems to contradict it because I can inflict truly heinous experiences on my mental models without personally suffering for it, but your point of view seems to imply that I should be able to write that off just because the mental model happens to be continuous in space-time with me. Or am I misunderstanding your point of view?
To give an analogy and a question of my own: what would you think about an unaligned alien AI simulating a human directly inside its own reasoning center? Such a simulated human would be continuous in space-time with the AI, so would you consider the human to be part of the AI and to have no moral value of their own?
To the first one: they aren’t actually suffering that much, or experiencing anything they’d rather not experience, because they’re continuous with you and you aren’t suffering.
I don’t actually think a simulated human would be continuous in space-time with the AI, because the computation wouldn’t be happening inside the qualia-having parts of the AI.