There exists a square-cube law (or something analogous) for computation: as a computer, or the data it processes, grows in size, computation becomes less efficient, less precise, or harder to engineer. As a result, a hard takeoff is either impossible or slow enough that the growth is never perceived as “explosive.” Thus, if and when strong AI is developed, it doesn’t go FOOM, and things change slowly enough that humans don’t notice anything.
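As a rough illustration of the intuition behind such a law (an assumption for this sketch, not part of the original argument): if a machine’s raw compute scales with its volume (∝ L³) while heat removal and external bandwidth scale with its surface area (∝ L²), then the usable fraction of that compute falls off roughly as 1/L as the machine grows.

```python
# Toy square-cube-style scaling sketch (illustrative assumption, not a claim
# from the original text): raw compute ~ volume (L^3), but cooling / I/O
# capacity ~ surface area (L^2), so sustainable compute is surface-limited
# and per-unit efficiency drops roughly as 1/L.

for L in [1, 2, 4, 8, 16]:           # linear size of the machine, arbitrary units
    volume = L ** 3                   # raw compute capacity ~ volume
    surface = L ** 2                  # cooling / bandwidth capacity ~ surface area
    efficiency = surface / volume     # usable fraction of raw compute ~ 1/L
    print(f"L={L:2d}  raw={volume:5d}  sustainable={surface:4d}  efficiency={efficiency:.3f}")
```

Under these assumptions, a 16x larger machine delivers only about 6% of its raw capacity sustainably, which is the kind of diminishing return this scenario posits.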