Point taken. This post seems unlikely to reach those people. Is it possible to communicate the importance of x-risks in such a short space to SL0s—maybe without mentioning exotic technologies? And would they change their charitable behavior?
I suspect the first answer is yes and the second is no (not without lots of other bits of explanation).
I agree with your estimates/answers. There are certainly SL0 existential risks (most people in the US understand nuclear war), but the issue is that the risks most targeted by the “x-risks community” sit above that level—asteroid strikes are SL2, nanotech is SL3, AI-foom is SL4. I think most people understand that x-risks are important in an abstract sense, but have very limited understanding of what the specific risks the community targets actually involve.