Some people recognize the difference between their immediate/naive desires and their long-term/societal desires.
What you want is not always something you can change. We do, after all, run on corrupted hardware.
Some people realize they are acting against their long-term/societal desires in favor of their immediate/naive ones, judge this locally good enough to do anyway, and simultaneously feel guilty about damaging the long-term goal. Our brains run massively parallel, and different threads do not always agree.
For Morality-As-Given
It is not possible for a thing both to exist and to have no access whatsoever to our frame of reference. That would put us instead in a glitchless, escapeless, agentless Matrix, which by premise is indistinguishable from "The Actual Universe", at which point the law of identity makes it BE the actual universe. A delusion which never differs in any way from an objective reality is, in fact, that objective reality. The difference is that we can find the stone and test its morality, to see whether "You should commit suicide." is moral.
The world in which the moral proposition is true and the world in which it is false differ in their physical laws and in the physical structure of their inhabitants' neurologies, such that acting on the proposition in the universe where it is true inflicts a moral effect upon the neurologies of its inhabitants, and in the other universe an immoral, or at least suboptimal, one.
Insufficient information for meaningful answer. Prerequisites include at least the "Understanding All Biomechanical and Neuropsychological Details of All Potentially Sentient Organisms" project and the "Unified Perfect Laws of Physics" project. That said, some statements about immoral things can still be made even without a complete morality assembled. Without doing a lot of computation, I don't know what 43875623746 × 3429856746 is. But with very little computation or further information, I already know the answer isn't 5.
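To make the analogy concrete, here is a minimal sketch (in Python, purely illustrative, using the numbers from the analogy above) of how ruling an answer out is far cheaper than computing the answer itself:

```python
# Illustrative sketch: rejecting a candidate answer is much cheaper
# than performing the full computation.
a, b = 43875623746, 3429856746

# Both factors exceed 1, so their product must exceed either factor alone.
lower_bound = max(a, b)
print(lower_bound > 5)  # True -- so the product cannot possibly be 5
print(a * b)            # the expensive full computation, if we actually want the value
```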
How any particular sentient evaluates morality is utterly irrelevant. Space Hitler can think as much as he likes that eliminating all the Space Jews is moral, and still be wrong about it, if there is a morality-as-given. It just means he disagrees with reality, and is as wrong about morality as some fervently religious people are about the origins of the universe. Nothing prevents you from constructing such an entity; it will merely be wrong.
Oh wow. I seem to have predicted the tone of the argument in the next section. >_>
Am I detecting a pattern on my own, or is EY leading me intentionally, or is there even a difference?