I agree that small differences in growth rates between firms or countries, compounded over many doublings of total output, will lead to large differences in final output. But I think there are quite a lot of other moving parts in this story before you get to the need for a pivotal act. It seems like you aren't pointing to the concentration of power per se (if you were, I think your remedies would look like normal boring stuff like corporate governance!); rather, I think you are making way more opinionated claims about the risk posed by misalignment.
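To make the compounding point concrete (a toy calculation with numbers of my own choosing, not ones from the argument above): measure time in doublings of total output, and suppose one firm's growth rate exceeds the average by a fraction $\epsilon$ in log terms, i.e. it grows by a factor $2^{1+\epsilon}$ per doubling rather than $2$. Its relative size after $n$ doublings is then

$$\frac{\text{fast firm}}{\text{average firm}} = \left(\frac{2^{1+\epsilon}}{2}\right)^{n} = 2^{\epsilon n}, \qquad \text{e.g. } \epsilon = 0.1,\ n = 30 \;\Rightarrow\; 2^{3} = 8\times.$$

So an edge that looks small in any one period really does compound into a large share of final output.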
Most proximately, I don't think that “modestly reduce the cost of alignment” or “modestly slow the development or deployment of unaligned AI” needs to look like a pivotal act. It seems like humans can already do those things a bit, and plausibly, even with no AI assistance, can buy more than 1 year of delay per calendar year of effort. AI assistance could help humans do those things better, improving our chances of getting over 1 year of delay per year. Modest governance changes could reduce the per-year risk of catastrophe. And you don't necessarily have to delay for that long in calendar time in order to get alignment solutions. Etc.
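Spelling out the delay arithmetic (my framing, not something claimed above): if each calendar year of effort pushes the arrival of unaligned AI back by $r$ years, then the remaining lead time changes by $r - 1$ per year, so the gap

$$\Delta(t) = \Delta(0) + (r - 1)\,t$$

grows rather than shrinks whenever $r > 1$. With $r = 1.2$, ten years of sustained effort buys an extra two years of lead for alignment work, with no single decisive act required.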