1) 2025, 2040, No prediction.
(I don’t trust myself to figure out what the long-tail possibilities look like that fall short of “global catastrophe” but that still might abort AI research indefinitely.)
2) < 5%.
3) < 5% for hours/days. < 10% for self-modifies within a few years. About a 50% chance for "helps humans develop and build super-human AI within 5 years."
4) No more.
5) No, not even close. Nuclear war or genetically engineered epidemics worry me more.
6) Neuron-level simulation of a mammalian brain, within a factor of 10 of real-time.