It seems quite likely to me a priori that “experts” would be driven to make fewer extreme predictions because they’re more interested in defending their status by adopting a moderate position and also more able to do so.
Is that really a priori? ie did you come up with that idea before seeing this post?
Did I? No.
Would I have? I’m pretty sure.
Then we’ll never know—hindsight bias is the bitchiest of bitches.
For the record, I did expect this prior to reading your analysis of the data. But I also expected the data to be more in line with the Maes-Garreau law.
I’m trying to use the outside view to combat it. It is hard for me to think up examples of experts making more extreme-sounding claims than interested amateurs. The only argument the other way that I can think of is that AI itself is so crazy that seeing it occur in less than 100 years is the extreme position, and the other way around is moderate, but I don’t find that very convincing.
In addition, I don’t see reason to believe I’m different from lukeprog or handoflixue.
Philosophy experts are very fond of saying AI is impossible, and neuroscience experts often proclaim it’ll take centuries… By the time you break it down into categories and account for the different audiences and expert cultures, I think we have too little data to say much.
I would a priori assume that “experts” with quote marks are mainly interested in attention, and extreme predictions here are unlikely to get positive attention (saying AI will happen in 75+ years is boring; saying it will happen tomorrow kills your credibility).
So, for me at least, yes.