In the post, he says “his track record is at best fairly mixed” and “Yudkowsky may have a track record of overestimating or overstating the quality of his insights into AI”; and in the comments, he says “Yudkowsky’s track record suggests a substantial bias toward dramatic and overconfident predictions”.
Yes, I think all of that checks out. It's hard to say, of course, because Eliezer rarely makes explicit predictions; but insofar as he does make them, I think he clearly puts a lot of weight on his inside view.
That doesn't make his track record "bad", but it's something to keep in mind when he makes predictions.
Sure! “I wouldn’t have predicted AlphaGo and lost money betting against the speed of its capability gains”.
This counts as a mistake, but I don't think it's important relative to the bad prediction about AI timelines that Ben brings up in his post. If Eliezer had explained why he was wrong, it would make his current position more convincing, especially given his condescending attitude towards e.g. Metaculus forecasts.
I still think there's something about the way Eliezer admits he was wrong that rubs me the wrong way, though it's hard to pin down exactly what. It's not correct to say he doesn't admit his mistakes per se; rather, there's some other problem with how fully he seems to "internalize" the fact that he was wrong.
I've retracted my original comment in light of your example, as it wasn't correct (despite having the right "vibe", whatever that means).