Eliezer’s hard takeoff scenario for “AI go FOOM” is that the AI takes off within a few hours to a few weeks. Let’s say the AI has to increase in intelligence by a factor of 10 for it to count as “FOOM” — that is about 3.3 doublings. If there is no increase in resources, intelligence has to double anywhere from roughly once an hour to once every few days through recursion or cascades alone. If intelligence doubles once a day, that corresponds to an annual growth factor of 2^365, or about 10^110. This is quite a large number. It seems more likely that “AI goes FOOM” will be the result of resource overhang than of recursion or cascades.
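A quick sanity check of this arithmetic (a minimal sketch; the 3-hour and 3-week takeoff windows are illustrative endpoints of “a few hours or weeks,” not figures from the original debate):

```python
import math

# Doublings needed for a 10x intelligence increase: log2(10) ~= 3.32
doublings_for_foom = math.log2(10)
print(f"Doublings for a 10x increase: {doublings_for_foom:.2f}")

# If intelligence doubles once per day, the annualized growth factor is 2^365
annual_factor = 2 ** 365
print(f"Annual growth factor: 2^365 ~= 10^{math.log10(annual_factor):.0f}")

# Implied doubling period for the assumed takeoff windows
for takeoff_hours in (3, 24 * 21):  # 3 hours vs. 3 weeks (illustrative assumptions)
    period = takeoff_hours / doublings_for_foom
    print(f"Takeoff in {takeoff_hours} h -> one doubling every {period:.1f} h")
```

Running this confirms the claim: a 10x increase over a few hours means doubling roughly every hour, over a few weeks roughly every few days, and daily doubling compounds to about 10^110 per year.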
Note that a nuclear chain reaction is not an example of recursion: once a nucleus is split, it cannot be split again. A chain reaction is more like a forest fire in very dry tinder, and is probably better explained as a resource overhang than as recursion.