How much money are we talking about? Can't we just get a million kids and teach them the dath ilani way, to at least cover the >20-year timelines?
To teach a million kids you need something like a hundred thousand Dath Ilani teachers. They don't currently exist.
That can be circumvented by first teaching, say, a hundred students, 10% of whom become teachers and help teach the next 'generation'. If each 'generation' takes 5 years and one teacher can teach 10 students per generation, the number of teachers doubles every 5 years, and you'd get a million Dath Ilanians in something like 50 years.
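A quick sketch of that doubling arithmetic, using the numbers from the comment above (the starting pool of 10 teachers is my own reading of "first teaching say a hundred students" at 10 students per teacher):

```python
# Rough simulation of the doubling argument above. Assumptions (from the
# comment): 10 starting teachers, each teaches 10 students per 5-year
# generation, 10% of students become teachers, and existing teachers keep teaching.
teachers = 10
total_trained = 0
years = 0
while total_trained < 1_000_000:
    students = teachers * 10          # students taught this generation
    total_trained += students
    teachers += int(students * 0.10)  # teacher count doubles each generation
    years += 5

print(f"~{years} years, {teachers:,} teachers, {total_trained:,} people trained")
# With these exact numbers it comes out to roughly 70 years -- the same order
# of magnitude as the "like 50 years" above.
```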
One teacher teaching 10 students, with 1 of them becoming a teacher, might be more feasible than it sounds. For example, if instead of Dath Ilani ways we're talking about a non-terrible level of math, then I've worked in a system that has 1 math teacher per 6-12 students, where 3%-5% of students become teachers and a generation takes 3-6 years.
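For comparison, the per-generation growth implied by those real-world-ish numbers (the midpoints of the quoted ranges are my own choice, purely illustrative):

```python
# Growth rate implied by the math-teaching parameters above, taking midpoints
# of the quoted ranges (illustrative values, not from the comment itself).
students_per_teacher = 9   # midpoint of 6-12
conversion_rate = 0.04     # midpoint of 3%-5% of students becoming teachers
generation_years = 4.5     # midpoint of 3-6 years

# Existing teachers keep teaching, so each generation multiplies the pool by:
multiplier = 1 + students_per_teacher * conversion_rate
print(f"teacher pool grows ~{multiplier:.2f}x every {generation_years} years")
# ~1.36x per generation: slower than the 2x above, but still exponential.
```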
The problem is, currently we have 0 Dath Ilani teachers.
Anyone up for creating thousands of clones of von Neumann and raising them to think that AI alignment is a really important problem? I’d trust them over me!
I don’t think they’d even need to be raised to think that; they’d figure it out on their own. Unfortunately we don’t have enough time.
Setting aside this proposal’s, ah, logistical difficulties, I certainly don’t think we should ignore interventions that target only the (say) 10% of the probability space in which superintelligence takes longest to appear.
So, is it now the consensus opinion round here that we’re all dead in less than twenty years? (Sounds about right to me, but I’ve always been a pessimist...)
It’s not consensus. Ajeya, Richard, Paul, and Rohin are prominent examples of people widely considered to have expertise on this topic who think it’s not true. (I think they’d say something more like 10% chance? IDK)
This author proposes exactly that: https://fantasticanachronism.com/2021/03/23/two-paths-to-the-future/
“I believe the best choice is cloning. More specifically, cloning John von Neumann one million times”