A couple of times I asked SIAI about the idea of splitting my donations with some other group, and of course they said that donating all of the money to them would still be the most leveraged way for me to reduce existential risks.
Nothing at all against SIAI, but if you’re in doubt and seeking expert advice, you should pick an expert who lacks really obvious institutional incentives to give one answer over others.
Regarding the rest of the comment, I found it kind of weird, and something about it freaked me out, though I’m not sure quite what. That doesn’t mean you’re doing anything wrong; I might just have biases or assumptions that make what you’re doing seem weird to me. I think it has something to do with your lack of skepticism or cynicism, and with the focus on looking for someone to follow that MatthewB mentioned. I guess your comment pattern-matches with things a very religious person would say; I’m just not sure whether that means you’re doing something wrong, or whether I’m having an adverse reaction to a reasonable set of behaviors because I have irrationally averse reactions to things that look religious.
Yeah, I realized it was silly for me to ask SIAI what they thought about the idea of giving SIAI less money, but I didn’t know who else to ask, and I still didn’t have enough confidence in my own sanity to make this decision on my own. And I was kinda hoping that the people at SIAI were rational enough to give an accurate and reasonably unbiased answer, despite the institutional incentives. SIAI has a very real and very important mission, and I would have hoped that its members would be able to think rationally about what is best for the mission, rather than what is best for the group. And the possibility remains that they did, in fact, give a rational and mostly unbiased answer.
The answer they gave was that donating exclusively to SIAI was the most leveraged way to reduce existential risks. Yes, there are other groups doing important work, but SIAI is more critically underfunded than they are, and the projects that we (yes, I said “we”, even though I’m “just” a donor) are working on this year are critical for figuring out the optimal strategies for humanity/transhumanity to maximize its probability of surviving into a post-Singularity future.
Heh, one of the projects they’re finally getting around to this year is a research paper examining how much existential-risk reduction you get for each dollar donated to SIAI. That’s something I’ve really been wanting to know, and I’d actually been feeling kinda guilty about not making more of an effort to figure it out on my own, or at least to get a vague estimate within a few orders of magnitude. I’d also been really annoyed that no one more qualified than me had already done this. But now they’re finally working on it. Yay. :)
Someone from SIAI, please correct me if I’m wrong about any of this.
And yes, my original comment seemed weird to me too, and kinda freaked me out. But I think it would have been a bad idea to deliberately avoid saying it, just because it sounds weird. If what I’m doing is a bad idea, then I need to figure this out, and find what I should be doing instead. And posting comments like this might help with that. Anyway, I realize that my way of thinking sounds weird to most people, and I don’t make any claim that this is a healthy way to think, and I’m working on fixing this.
And as I mentioned in another comment, it would just feel wrong to deliberately not say this stuff just because it sounds weird and might make SIAI look bad. That kind of thinking belongs to the Dark Arts, and is probably just a bad habit left over from Christianity, not something that SIAI actually endorses, afaik.
And I do, in fact, have lots of skepticism and cynicism about SIAI, their mission, and the people involved. This skepticism probably would have caused me to abandon them and their mission long ago… if I had had somewhere better to go instead, or a more important mission. But after years of looking, I haven’t found any cause more important than existential risk reduction, and I haven’t found any group working towards this cause more effectively than SIAI, except possibly some of the other groups I mentioned, and a preliminary analysis suggests they’re not actually doing any better than SIAI. And starting my own group still looks like a really silly idea.
And yes, I’m aware that I still seem to talk and think like a religious person. I was raised as a Christian, and I took Christianity very seriously. Seriously enough to realize that it was no good, and that I needed to get out. So I tried to replace my religious fanaticism with what’s supposed to be an entirely non-religious and non-fanatical cause, but I still tend to think and act both religiously and fanatically. I’m working on that.
I also have an averse reaction to things that look religious. That’s one of the many things contributing to my trouble with self-hatred. Anyway, I’m working on that.
Oh, and one more comment about cynicism: I currently think that 1% is an optimistic estimate of the probability that humanity/transhumanity will survive into a positive post-Singularity future, but it’s been a while since I reviewed why I believe this. Another thing to add to my to-do list.