How is ... consistent with ...? Doesn’t AGI imply the latter?
I’m using a weird definition of AGI here, basically “an AI that can do my job”. I’m imagining a cobbled-together system of transformers, each automating some piece of what I do, that together replace most information jobs like coding, scientific research, and advertising. So in a lot of the worlds where AGI happens, there’s no hard takeoff: AIs help do AI research, and labor may no longer be a major limiting factor in AI development, but there isn’t a >1 OOM increase in AI research output from AI.
This also means that, in most of that 30% of worlds, I expect no hard takeoff. Some low-hanging fruit gets picked by machines, but not enough for a FOOM.
Thanks for bringing up the contradiction, though. I really need to go back and clarify a lot of my statements.