It’s possible, but it is also possible that at some threshold of intelligence it finds a pathway far richer and more interesting than what we observe as humans (compare it to earthworms knowing nothing but dirt), and leaves for greener pastures.
The problem is that humanity’s own behavior will wipe humanity out anyway: if the first AGI miniaturizes into some quantum world and leaves, we will simply create a second one.
I mean that if that’s what happens, we will redefine intelligence and try to build something that doesn’t leave.