I think a substantial part of human alignment is that humans need other humans in order to maintain their power. We have plenty of examples of humans being fine with torturing or killing millions of other humans when they have the power to do so, but torturing or killing almost all the humans in one's sphere of control is essentially suicide. This means that, purely instrumentally, human goals have required that large numbers of humans continue to exist and function moderately well.
A superintelligent AI is primarily a threat due to the near certainty that it can devise means of maintaining power that are independent of human existence. Humans, by definition, can't do that; their restraint comes from this dependence, not from anything about alignment.
Okay, so… does anyone have any examples of anything at all, even fictional or theoretical, that is “aligned”? Other than tautological examples like “FAI” or “God”.