Yes, but we need to add “Humanity goes extinct before this date,” which is also possible. ((( A sufficiently large catastrophe could prevent the creation of AI, such as a supervirus or a nuclear war.
That would be another way for exponential growth in human AI research to stop, yes. You can think of it as one of the options under “(etc.)”, or as a special case of “not enough resources”.