Only if the AI has goals that both require additional energy and lack a small, bounded success condition.

For example, suppose a UFAI has a goal that requires humans to exist, but it is not allowed to create humans or cause more to be created. Then, if all humans are already dead, it won't do anything.