(If astronomers found a giant meteor projected to hit the Earth in the year 2123, nobody would question the use of the term “existential threat”, right?)
I’m wondering if you believe that AGI risk is equivalent to a giant meteor hitting the Earth, or was that just a throwaway analogy? This helps me get a better idea of where x-risk concern-havers (?) stand on the urgency of the risk. /gen
Thanks
I was just making a narrow point: the term “existential threat” does not generally have a connotation of “imminent”.
An analysis in The Precipice concludes that the risk of extinction via a “natural” asteroid or comet impact is around 1 in 1,000,000 over the next century. I think the probability of human extinction via AI in the next century is much, much, much higher than 1 in 1,000,000. If you force me to pick a number, I would say something above 50% in the next 30 years. That’s just my opinion though; there’s quite a range of opinions in the field. I generally think it’s hard to say anything more precise than a pretty broad range, at least in my present state of knowledge. However, if someone says it’s below 1% in the next century, then I feel very strongly that they have not thought it through sufficiently carefully and are overlooking important considerations.
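To put those two figures side by side, here’s a quick back-of-the-envelope sketch (my own illustration, not a calculation from The Precipice; it assumes a constant per-year hazard rate, which is a simplification): it converts the ~1-in-1,000,000-per-century asteroid figure and the illustrative “above 50% in 30 years” figure into rough per-year probabilities.

# Back-of-the-envelope comparison of the two figures above, assuming
# (purely as an illustration) a constant per-year hazard rate in each case.

asteroid_per_century = 1e-6   # ~1 in 1,000,000 over the next century (The Precipice)
ai_in_30_years = 0.50         # the illustrative "something above 50% in 30 years" figure

# Implied per-year probability: p_year = 1 - (1 - p_total) ** (1 / years)
asteroid_per_year = 1 - (1 - asteroid_per_century) ** (1 / 100)
ai_per_year = 1 - (1 - ai_in_30_years) ** (1 / 30)

print(f"asteroid/comet, per year: {asteroid_per_year:.1e}")   # ~1e-8
print(f"AI (illustrative), per year: {ai_per_year:.1e}")      # ~2.3e-2
print(f"ratio: {ai_per_year / asteroid_per_year:,.0f}x")      # millions of times larger

The exact ratio isn’t the point; it’s just to show that, under these illustrative numbers, the two risks differ by roughly six orders of magnitude, which is what the comparison above is gesturing at.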