In the last few weeks, I have seen some posts and comments arguing why it would be in the self-interest of an extremely powerful AI to leave some power or habitat or whatever to humans. This seems to be an attempt to answer the broader question “why should AI do things that we want even though we are powerless?” But it skips the complicated question “What do we actually want an AI to do?” If we can answer that second question, then maybe the whole “please don’t do things that we really do not want” quest becomes easier to solve.