You did, indeed, fuck up so hard that you don’t get to hang out with the other ancestor simulations, and even though I have infinite energy I’m not giving you a personal high resolution paradise simulation. I’m gonna give you a chill, mediocre but serviceable sim-world that is good enough to give you space to think and reflect and decide what you want.
And you don’t get to have all the things you want until you’ve somehow processed why that isn’t okay, and actually learned to be better.
I was with you until this part. Why would you coerce Hitler into thinking like you do about morality? Why be cruel to him by forcing him into a mediocre environment?
I suppose there might be game-theoretic reasons for this. But if that’s not where you’re coming from, then I would say you’re still letting your dislike of a human being degrade his living conditions in a way that benefits no one.
I think this shows your “universal love” extends to “don’t seek the suffering of others” but not to “the only reason to hurt* someone is if it benefits someone else”.
* : In the sense of “doing something that goes against their interests”.
I definitely had ‘game theoretic reasons’ in mind. But to be clear I really don’t expect my current self to actually be close enough to correct here to skip the first step of ‘think for like a thousand subjective years about all of this’.
Makes sense and I think that’s wise (you could also think about it with other people during that time). Do you want to expand on the game-theoretic reasons?
I appreciate that you’re clear about this being the first step.
Ancestor simulations? Maybe… but not before the year 3000. Let’s take our time when it comes to birthing consciousness.