One concept in my moral system relies on the question of how you would respond to permanent retaliation if you went rogue. Could you stop an endless attack on your wellbeing, provoked by doing things that other people hate? In a world with many extremely intelligent beings this could be very difficult, and even in a world where you were the only bad Super-Einstein it would at least be tiresome (or resource-inefficient). A superintelligent individual might therefore prefer a situation where it does not need to defend itself indefinitely. This is similar to the outcome of Wait But Why’s concept of the cudgel (browser-search for “cudgel”). Ultimately, this concept relies heavily on there being at least some possibility of inflicting a small but ineradicable pain on a Super-Einstein. So in my opinion it is not really applicable to a singularity event, but it could be useful for slower developments.
Pain can also be defined for non-biological beings. For me it is just a word for something undesirable that is hardwired into your being. And maybe there is something undesirable for everything in the universe. One rather metaphysical candidate would be a kind of inertia (in physics, the resistance of any object to a change in its velocity). You could argue that if you understand the movement of an entity (more concretely, its goals), you could find a way to harm it (with a countervailing movement), which would register as “pain” for that entity. This concept is still very anthropocentric, so I am not sure whether such a change in movement could lead to, or even already count as, a positive outcome for humanity. Or maybe it would not register at all.