(I clicked through to see your other comments after disagreeing with one. Generally, I like your comments!)
I think that EA writers and culture are less “lost” than you think, on this axis. I think that most EA/rationalist/x-risk-focused people in this subculture would basically agree with you that the knowledge explosion/recursive acceleration of technological development is the core problem, and when they talk about “AI safety” and so forth, they’re somewhat shorthanding this.
Like, I think most of the people around here are, in fact, worried about some of the products rolling off the end of the assembly line, but would also pretty much immediately concur with you that the assembly line itself is the root problem, or at least equally important.
I can’t actually speak for everybody, of course, but I think you might be docking people more points than you should.
Hi Duncan, thanks for engaging.

> I think that EA writers and culture are less “lost” than you think, on this axis. I think that most EA/rationalist/x-risk-focused people in this subculture would basically agree with you that the knowledge explosion/recursive acceleration of technological development is the core problem
OK, where are their articles on the subject? What I see so far is a ton of articles about AI, and nothing about the knowledge explosion except what I wrote myself. I spent almost all day, every day, for a couple of weeks on the EA Forum, and observed the same thing there.
That said, I’m here because the EA community is far more interested in x-risk than the general culture and the vast majority of intellectual elites, and I think that’s great. I’m hoping to contribute by directing some attention away from symptoms and towards sources. This is obviously a debatable proposition and I’m happy to see it debated, no problem.