I agree that Lex’s audience is not representative. I also think this is the largest-sample poll on the topic that I’ve seen, by at least 1 OOM, which counts for a fair amount. Perhaps my wording was wrong.
I think the first half of the Anthropic quote implies much more than 10% on AGI in the next decade. I included the second part to avoid selectively quoting. Saying “>10%” seems to me mainly a PR-style move to avoid seeming too weird or confident; after all, it is compatible with 15% or with 90%. When I read the first part of the quote, I think something like “25% on AGI in the next 5 years, and 50% in 10 years,” but that is not what they said, and I’m going to respect their desire to write vague words.
I find this part really confusing
Sorry. This framing was useful for me, and I hoped it would help others, but maybe not. I probably disagree about how strong the evidence from the existence of “sparks of AGI” is. What I am aiming for here is something like: imagine the set of possible worlds that look a fair amount like Earth, then condition on the worlds that have a “sparks of AGI” paper, and ask how much longer those worlds have until AGI. I think that, even without knowing much else about these worlds, they don’t have very many years.
Yep, graph is per year, I’ve updated my wording to be clearer. Thanks.
Can you clarify what you mean by this?
When I think about when we will see AGI, I try to use a variety of models weighted by how good and useful they seem. I believe that, when doing this, at least 20% of the total weight should come from models/forecasts that are based substantially on extrapolating from recent ML progress. This recent literature review is a good example of how one might use such weightings.
Thanks for all the links!