The primary reason I think SI should be supported is that I like what the organization actually does, and wish it to continue. The Less Wrong Sequences, Singularity Summit, rationality training camps, and even HPMoR and Less Wrong itself are all worth paying some amount of money for.
I think that my own approach is similar, but with a different emphasis. I like some of what they’ve done, so my question is how do I encourage those pieces. This article was very helpful in prompting some thought about how to handle that. I generally break their work down into three categories:
Rationality (minicamps, training, LW, HPMoR):
Here I think they’ve done some very good work. Luckily, the new spinoff will allow me to support these pieces directly.
Existential risk awareness (singularity summit, risk analysis articles):
Here their record has been mixed. I think the Singularity Summit has been successful, other efforts less so but seemingly improving. I can support the Singularity Summit by continuing to attend and potentially donating directly if necessary (since it’s been running positive in recent years, for the moment this does not seem necessary).
Original research (FAI, timeless decision theory):
This is the area where I do not find them to be at all effective. From what I’ve read, there seems to be a large disconnect between ambitions and capabilities. Given that I can now support the other pieces separately, this is why I would not donate to SIAI generally.
My overall view would be that, at present, there is no real organization to support. Rather, there is a collection of talented people whose freedom to work on interesting things I’m supporting. Given that, I want to support those people where I think they are effective.
I find Eliezer in particular to be one of the best pop-science writers around (and I most assuredly do not mean that term as an insult). Things like the Sequences or HPMoR are thought-provoking and worth supporting. I find the general work on rationality to be critically important and timely.
So, while I agree that much of the work being done is valuable, my conclusion has been to consider how to support that directly rather than SI in general.
I don’t see how this constitutes a “different emphasis” from my own. Right now, SI is the way one supports the activities in question. Once the spinoff has finally spun off and can take donations itself, it will be possible to support the rationality work directly.
The different emphasis comes down to your comment that:
...they support SI despite not agreeing with SI’s specific arguments. Perhaps you should, too...
In my opinion, I can more effectively support those activities that I think are effective by not supporting SI. Waiting until the Center for Applied Rationality gets its tax-exempt status in place allows me to both target my donations and directly signal where I think SI has been most effective up to this point.
If they end up having short-term cashflow issues prior to that split, my first response would be to register for the next Singularity Summit a bit early since that’s another piece that I wish to directly support.
So, are you saying you’d be more inclined to fund a Rationality Institute?