also a lot of people will suggest that alignment people are discredited because they all believed AGI was 3 years away, because surely that’s the only possible thing an alignment person could have believed. I plan on pointing to this and other statements similar in vibe that I’ve made over the past year or two as direct counterevidence against that
(I do think a lot of people will rightly lose credibility for having very short timelines, but I think this includes a big mix of capabilities and alignment people, and I think they will probably lose more credibility than is justified because the rest of the world will overupdate on the winter)
My timelines are roughly 50% probability on something like transformative AI by 2030, 90% by 2045, and a long tail afterward. I don’t hold this strongly either, and my views on alignment are mostly decoupled from these beliefs. But if we do get an AI winter longer than that (through means other than by government intervention, which I haven’t accounted for), I should lose some Bayes points, and it seems worth saying so publicly.
to be clear, a “winter/slowdown” in my typology is more about the vibes and could be only a few years of counterfactual slowdown. like the dot-com crash didn’t take that long for companies like Amazon or Google to recover from, but it was still a huge vibe shift
also to further clarify, this is not an update I’ve made recently. I’m just making this post now as a regular reminder of my beliefs, because it seems good to have a record of this kind of thing (though everyone who has heard me ramble about this irl can confirm I’ve believed something like this for a while now)
I was someone who had shorter timelines. At this point, most of the concrete part of what I expected has happened, but the “actually AGI” thing hasn’t. I’m not sure how long the tail will turn out to be. I only say this to get it on record.