> I think that EA writers and culture are less “lost” than you think, on this axis. I think that most EA/rationalist/x-risk-focused people in this subculture would basically agree with you that the knowledge explosion/recursive acceleration of technological development is the core problem.
Hi Duncan, thanks for engaging.
OK, where are their articles on the subject? What I see so far is a ton of articles about AI and nothing about the knowledge explosion unless I wrote it. I spent almost all day, every day, for a couple of weeks on the EA Forum and observed the same thing there.

That said, I’m here because the EA community is far more interested in x-risk than the general culture and the vast majority of intellectual elites, and I think that’s great. I’m hoping to contribute by directing some attention away from symptoms and toward sources. That is obviously a debatable proposition, and I’m happy to see it debated.