I just want to clarify that there are also “create more alignment researchers” people, not just “buy time” people and “technical alignment” people. I am legally and morally obligated to avoid anything related to “buying time”. And I also don’t touch it with a ten-foot pole because it seems much, much, much easier and safer to double the number of people working on alignment than to halve the annual R&D of the global AI industry.
If halving the annual R&D of the global AI industry is equivalent to doubling the length of time before AGI, then I think that would be substantially more valuable than doubling the number of people working on alignment. I don’t think “grow alignment researchers by X” and “lengthen timelines by X” are equally valuable.
There aren’t types of people!