I think there’s a delay in outreach for three reasons:
1. There’s substantial conflict within the community about the effect of doing that outreach. Trying to sound the alarm might just convince the whole world that AGI is imminent and that whoever gets there first controls the world, which would accelerate progress dramatically. Normies don’t seem to grasp this yet, but the logic is compelling enough that it would convince many if there were a real effort to get everyone thinking about it. This is why I’ve largely kept my mouth shut, and probably why many others have as well.
2. We as a community strongly believe outreach won’t work; we assume the coordination problems are too large. But we don’t think about it much, for multiple reasons including 1 and 3 here. There are strong arguments that we should at least think about it more.
3. The types of people who tend to take abstract arguments like AGI risk seriously are typically not the types who want to take on massive social projects. There are many exceptions, like Rob Miles, but I think the averages shape our approach as a community.
I do think the community is moving toward focusing more on this angle, and that we probably should.