According to your survey, 38.5% of respondents have read at least 75% of the Sequences, yet only 16.5% think that unfriendly AI is the most fearsome existential risk.
So what? I’m not even sure that Eliezer himself considers uFAI the most likely source of extinction. It’s just that Friendly AI would help save us from most of the other possible sources of extinction too (not just from uFAI), and from several other sources of suffering as well (not just extinction), so figuring it out kills multiple birds with one stone.
As a point of note, I myself didn’t place uFAI as the most likely existential risk in that survey. That doesn’t mean I share your attitude.