But I never hear EA folks talk about these kinds of issues, and these ideas don’t seem to resonate with the community when I bring them up. I’m still left wondering, what is the disconnect here?
I’ve noticed that LW is generally more cynical about civilization than EA (probably inspired by EY). You can see it in the framings. The “reducing existential risk” framing focuses on risks, not on making something happen that gives us more control than we currently have; the implicit theme is that humans will be in control by default. Whereas the way EY frames things, it’s more like “there’s a big challenge coming up and civilization isn’t reacting properly; we have a lot of work to do forming pockets of sanity and saving everyone with the help of magic-like technology, which we’re forced to develop and master fast enough (because others are going to fuck it up by default).”