This high probability is most likely an artifact of undifferentiated extrapolation across the different routes by which AI could arrive. If you take the post-2022 curve and extrapolate it backwards, you end up assigning too much probability mass to dates earlier than the period in which some of the grounding technologies for some versions of AGI will even be created.
I suspect that if these experts were asked to divide the possible forms HLMI could take into distinct categories and then assign a probability to each, this artifact would become more obvious and the overall numbers would come down.