Your posts on SIAI have had a veneer of evenhandedness and fairness, and that continues here. But given what you don’t say in your posts, I cannot avoid the impression that you started out with the belief that SIAI was not a credible charity and, rather than investigating the evidence both for and against that belief, marshaled the strongest arguments against donating to SIAI while ignoring any evidence in favor of donating. I almost hesitate to link to EY lest you dismiss me as one of his acolytes, but see, for example, A Rational Argument.
In your top-level posts you have eschewed references to any of the publicly visible work that SIAI does, such as the Summit and the presentation and publication of academic papers. Some of this work is described at this link to SIAI’s account of its 2009 achievements. The 2010 Summit is described here. As for Eliezer’s current project, at the 2009 achievements link SIAI states that he is working on a book on rationality:
Yudkowsky is now converting his blog sequences into the planned rationality book, which he hopes will significantly assist in attracting and inspiring talented individuals to effectively work towards the aims of a beneficial Singularity and reduced existential risk.
You could have chosen to make part of your evaluation of SIAI an analysis of whether EY’s book will ultimately succeed at this goal, or of whether it is the most valuable work EY could be doing to reduce existential risk. But I’m not sure how his work transforming the fully public LW sequences into a book is insufficiently transparent, or why it is not something for which he and SIAI can be held accountable once it is published.
Moreover, despite your professed interest in existential risk reduction, and despite references in others’ comments on your posts to the Future of Humanity Institute at Oxford, you suggest donating to GiveWell-endorsed charities as an alternative to SIAI donations without even mentioning FHI as a possible alternative in the field of existential risk reduction. Perhaps you find FHI just as non-credible and non-accountable a charity as SIAI, but whatever FHI’s failings, it’s hard to see how they are exactly the same ones you have ascribed to SIAI. Perhaps you believe that a charity that has not been evaluated and endorsed by GiveWell can’t possibly be worthwhile. I can’t avoid the thought that if you were really interested in existential risk reduction, you would spend at least some tiny percentage of the time you’ve spent writing these posts against SIAI on investigating FHI as an alternative.
I would be happy to engage with you or others on the site in a fair and unbiased examination of the case for and against SIAI (and/or FHI, the Foresight Institute, the Lifeboat Foundation, etc.). Although I may come across as strongly biased in favor of SIAI in this comment, I have my own concerns about SIAI’s accountability and public relations, and have had numerous conversations with those within the organization about those concerns. But with limited time on my hands, and faced with such a one-sided and at times even polemical presentation from you, I find myself almost forced into the role of SIAI defender, so that I can at least provide some of the positive information about SIAI that you leave out.
I agree it would be good if more information on finances were readily available. Tax returns are available on GuideStar (with free registration), although I think the most recent is from 2008. As for leadership structure, is this link the sort of thing you had in mind, or were you looking for an actual org chart or something similar?