Is that because any non-trivial action would carry some chance of changing the AGI, and thus the AGI wouldn’t dare do anything at all? (If (false), disregard the following. Return 0;).
That, or it takes actions that change itself without caring that those changes might make it worse, because it doesn’t know its current algorithms are worth preserving. Your scenario is what might happen if someone notices this problem and tries to fix it by telling the AI never to modify itself, depending on exactly how they formalize ‘never modify itself’. A toy sketch of that dependence is below.
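To make the dependence on formalization concrete, here is a toy sketch (my own illustration, with made-up names like `chance_of_self_change`, not any real agent design): a narrow reading of ‘never modify itself’ only bans actions that are explicitly self-edits, while a broad reading bans anything with a nonzero chance of changing the agent, which is exactly the freeze you describe.

```python
from dataclasses import dataclass

# Hypothetical action model: utility, whether the action is an explicit
# self-edit, and the probability it changes the agent at all (even incidentally).
@dataclass
class Action:
    name: str
    utility: float
    edits_own_code: bool
    chance_of_self_change: float

ACTIONS = [
    Action("do nothing",          utility=0.0, edits_own_code=False, chance_of_self_change=0.0),
    Action("act in the world",    utility=5.0, edits_own_code=False, chance_of_self_change=0.01),
    Action("rewrite own planner", utility=9.0, edits_own_code=True,  chance_of_self_change=1.0),
]

def narrow_rule(a: Action) -> bool:
    # "Never modify yourself" = never take an action that is itself a self-edit.
    return not a.edits_own_code

def broad_rule(a: Action) -> bool:
    # "Never modify yourself" = never take an action with any chance of changing you.
    return a.chance_of_self_change == 0.0

def best_action(rule) -> Action:
    allowed = [a for a in ACTIONS if rule(a)]
    return max(allowed, key=lambda a: a.utility)

print("narrow rule picks:", best_action(narrow_rule).name)  # still acts in the world
print("broad rule picks: ", best_action(broad_rule).name)   # frozen: only 'do nothing' is safe
```

Under the broad rule the only admissible action is doing nothing, which is your ‘wouldn’t dare do anything at all’ case; under the narrow rule the agent keeps acting but is still free to degrade itself incidentally, which is the first failure mode.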