Have you read Nick Bostrom’s paper, Astronomical Waste? You don’t have to be able to affect the probabilities by very much for existential risk to be the thing to worry about, especially if you have a decent dose of credence in utilitarianism.
Is there a decent chance, in your view, of decreasing x-risk by 10^-18 if you put all of your resources into it? That could be enough. (I agree that this kind of argument is worrisome; maybe expected utility theory or utilitarianism breaks down with these huge numbers and tiny probabilities, but it is worth thinking about.)
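For concreteness, here is a minimal sketch of the expected-value arithmetic behind that kind of claim. The figure for potential future lives is a purely illustrative placeholder, not a number taken from Bostrom's paper (which argues the true figure is astronomically large):

```python
# Expected-value sketch for the "tiny probability shift" argument.
# Both figures below are illustrative assumptions, not Bostrom's actual estimates.

future_lives = 1e38        # assumed number of potential future lives (placeholder)
risk_reduction = 1e-18     # the probability shift discussed above

expected_lives_saved = risk_reduction * future_lives
print(f"Expected future lives saved: {expected_lives_saved:.0e}")  # prints 1e+20
```

The point is just that even a 10^-18 shift, multiplied by a sufficiently vast future, yields an enormous expected number of lives.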
If you’re sold on x-risk, are there other candidate interventions that might offer higher expected x-risk reduction on the margin (after due reflection)? (I’m not saying SIAI clearly wins, I just want to know what else you’re thinking about.)
Have you read Nick Bostrom’s paper, Astronomical Waste? You don’t have to be able to affect the probabilities by very much for existential risk to be the thing to worry about, especially if you have a decent dose of credence in utilitarianism.
Is there a decent chance, in your view, of decreasing x-risk by 10^-18 if you put all of your resources into it? That could be enough.
I agree with what you say above. I personally believe that it is possible for individuals to decrease existential risk by more than 10^(-18) (though I know reasonable people who have at one time or another thought otherwise).
If you’re sold on x-risk, are there other candidate interventions that might offer higher expected x-risk reduction on the margin (after due reflection)? (I’m not saying SIAI clearly wins, I just want to know what else you’re thinking about.)
Two points to make here:
(i) Though there’s huge uncertainty in judging these sorts of things and I’m by no means confident in my view on this matter, I presently believe that SIAI is increasing existential risk through unintended negative consequences. I’ve written about this in various comments, for example here, here and here.
(ii) I’ve thought a fair amount about other ways in which one might hope to reduce existential risk. I would cite the promotion and funding of an asteroid strike prevention program as a possible candidate. As I discuss here, placing money in a donor-advised fund may be the best option. I wrote out much more detailed thoughts on these points, which I can send you by email if you want (just PM me), but which are not yet ready for posting in public.
I agree that ‘poisoning the meme’ is a real danger, and that SIAI has historically had both positives and negatives with respect to its reputational effects. My net expectation for it at the moment is positive, but I’ll be interested to hear your analysis when it’s ready. [Edit: apparently the analysis was about asteroids, not reputation.]
Here’s the Fidelity Charitable Gift Fund for Americans. I’m skeptical about asteroid strike prevention in light of recent investments in that area and the technology curve, although there is potential for demonstration effects (good and bad) with respect to more likely risks.