I am interested in the trade-off between directing funds/energy towards explicitly addressing existential risk and directing funds/energy towards education. In anything but the very near term, the number of altruistic, intelligent rationalists appears to be an extremely important determinant of prosperity, chance of survival, and so on. There also appears to be a lot of low-hanging fruit, both in improving the rationality of exceptionally intelligent individuals and in increasing the number of moderately intelligent individuals who become exceptionally intelligent.
Right now, investment (especially of intelligent rationalists’ time) in education seems much more valuable than direct investment in existential risk reduction.
Eliezer’s assessment seems to be that the two projects have approximately balanced payoffs, so that spending time on either at the expense of the other is justified. Is this correct? How do other people here feel?
It seems to me that increasing the number of altruistic, intelligent rationalists via education is just a means of explicitly addressing existential risk, so your comment, while interesting, is not directly relevant to multifoliaterose’s post.
The question in the post is whether we should direct our energies explicitly towards risk reduction. I suspect the answer may be irrelevant at the moment, because the best way to reduce existential risk in the long term and the best way to achieve our other goals may both be through education / outreach.
My uncertainty also bears on the question: should I donate to risk reduction charities? I question whether risk reduction charities are the best approach to reducing risk.
> I suspect the answer may be irrelevant at the moment, because the best way to reduce existential risk in the long term and the best way to achieve our other goals may both be through education / outreach.
Possibly, but the people you want to target for education/outreach may depend on what you’d like them to eventually do, so it still seems useful to work that out first.
> My uncertainty also bears on the question: should I donate to risk reduction charities? I question whether risk reduction charities are the best approach to reducing risk.
The people running such charities have surely already thought of the idea that education/outreach is currently the best way to reduce risk. For example, SIAI is apparently already spending almost all of its money and volunteer time on education and outreach (such as LW, Eliezer’s rationality book, the visiting fellows program, the Singularity Summit).
> The people running such charities have surely already thought of the idea that education/outreach is currently the best way to reduce risk. For example, SIAI is apparently already spending almost all of its money and volunteer time on education and outreach (such as LW, Eliezer’s rationality book, the visiting fellows program, the Singularity Summit).
If you believe that education has a significant effect on existential risk, then charities not explicitly concerned with existential risk may nevertheless mitigate it more effectively as a byproduct than, say, the SIAI does. In particular, you shouldn’t dismiss non-risk-reduction charities out of hand because of a supposed difference of scale.
At face value, you should still expect someone whose ultimate goal is risk reduction to focus on it more effectively than someone pursuing it as a byproduct. But this effect may be counteracted by a large difference in competence, or by other mitigating factors such as social influence.