To what degree is this amenability to help others actually hard-wired self-preservation? I mean, if you (Eliezer) hold that superhuman AI inevitably is coming, and that most forms of it will destroy mankind, isn’t the desire to save others from that fate the same as the desire to save yourself? Rewrite the scenario such that you save mankind with FAI but die in the process. That sounds more like altruism.