SI’s Summer 2012 Matching Drive Ends July 31st
The Singularity Institute’s summer 2012 matching drive ends on July 31st! Donate by the end of the month to have your gift matched, dollar for dollar.
As of this posting, SI has raised $70,000 of the $150,000 goal.
The announcement says:
Since we published our strategic plan in August 2011, we have achieved most of the near-term goals outlined therein… In the coming year, the Singularity Institute plans to do the following:
Hold our annual Singularity Summit, this year in San Francisco!
Spin off the Center for Applied Rationality as a separate organization focused on rationality training, so that the Singularity Institute can focus more exclusively on Singularity research and outreach.
Publish additional research on AI risk and Friendly AI.
Eliezer will write an “Open Problems in Friendly AI” sequence for Less Wrong. (For news on his rationality books, see here.)
Finish Facing the Singularity and publish ebook versions of Facing the Singularity and The Sequences, 2006-2009.
And much more! For details on what we might do with additional funding, see How to Purchase AI Risk Reduction.
If you’re planning to earmark your donation to CFAR (Center for Applied Rationality), here’s a preview of what CFAR plans to do in the next year:
Develop additional lessons teaching the most important and useful parts of rationality. CFAR has already developed and tested over 18 hours of lessons, including classes on how to evaluate evidence using Bayesianism, how to make more accurate predictions, how to be more efficient using ideas from economics, how to use thought experiments to better understand your own motivations, and much more.
Run immersive rationality retreats to teach from our curriculum and to connect aspiring rationalists with each other. CFAR ran pilot retreats in May and June. Participants in the May retreat called it “transformative” and “astonishing,” and the average response on the survey question, “Are you glad you came? (1-10)” was a 9.4. (We don’t have the June data yet, but people were similarly enthusiastic about that one.)
Run SPARC, a camp on the advanced math of rationality for mathematically gifted high school students. CFAR has a stellar first-year class for SPARC 2012; most students admitted to the program placed in the top 50 on the USA Math Olympiad (or performed equivalently in a similar contest).
Collect longitudinal data on the effects of rationality training, to improve our curriculum and to generate promising hypotheses to test and publish, in collaboration with other researchers. CFAR has already launched a one-year randomized controlled study tracking reasoning ability and various metrics of life success, using participants in our June minicamp and a control group.
Develop apps and games about rationality, with the dual goals of (a) helping aspiring rationalists practice essential skills, and (b) making rationality fun and intriguing to a much wider audience. CFAR has two apps in beta testing: one training players to update their own beliefs the right amount after hearing other people’s beliefs, and another training players to calibrate their level of confidence in their own beliefs. CFAR is working with a developer on several more games training people to avoid cognitive biases.
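For readers wondering what “training players to calibrate their level of confidence” might look like in practice, here is a minimal, hypothetical sketch (in Python; not taken from CFAR’s actual apps) of the kind of scoring rule such a game could use. Each prediction pairs a stated confidence with whether the claim turned out true, and lower scores indicate better calibration.

```python
# Hypothetical illustration only (not CFAR's app): score a player's calibration
# with the Brier score, the mean squared error between stated confidence (0.0-1.0)
# and the actual outcome (True/False). Lower is better.

def brier_score(predictions):
    """predictions: list of (confidence, outcome) pairs."""
    return sum((conf - float(outcome)) ** 2 for conf, outcome in predictions) / len(predictions)

# A well-calibrated player's 0.9-confidence claims should come true about 90%
# of the time; systematic over- or under-confidence raises the score.
sample = [(0.9, True), (0.7, True), (0.6, False), (0.8, True), (0.5, False)]
print(round(brier_score(sample), 3))  # 0.15
```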
In another post, I compared the goals in our August 2011 strategic plan to our current situation, summarizing:
That’s it for the main list! Now let’s check in on what we said our top priorities for 2011-2012 were:
Public-facing research on creating a positive singularity. Check. SI has more peer-reviewed publications in 2012 than in all past years combined.
Outreach / education / fundraising. Check. Especially through CFAR.
Improved organizational effectiveness. Check. Lots of good progress on this.
Singularity Summit. Check.
In summary, I think SI is a bit behind where I hoped we’d be by now, though this is largely because we’ve poured so much into launching CFAR, and as a result, CFAR has turned out to be significantly cooler at launch than I had anticipated.
Fundraising has been a challenge. One donor failed to actually give their $46,000 pledge despite repeated reminders and requests, and our support base is (understandably) anxious to see a shift from movement-building work to FAI research, a shift I have been fighting for since I was made Executive Director. (Note that spinning off rationality work to CFAR is a substantial part of trimming SI down into being primarily an FAI research institute.)
Reforming SI into a more efficient, effective organization has been my greatest challenge. Frankly, SI was in pretty bad shape when Louie and I arrived as interns in April 2011, and there have been an incredible number of holes to dig SI out of — and several more remain. (In contrast, it has been a joy to help set up CFAR properly from the very beginning, with all the right organizational tools and processes in place.) Reforming SI also presents a fundraising problem: the work is time-consuming and sometimes costly, but generally unexciting to donors.
I can see the light at the end of the tunnel, though. We won’t reach it if we can’t improve our fundraising success in the next 3-6 months, but it’s close enough that I can see it. SI’s path forward, from my point of view, looks like this:
We finish launching CFAR, which takes over the rationality work SI was doing. (Before January 2013.)
We change how the Singularity Summit is planned and run so that it pulls our core staff away from core mission work to a lesser degree. (Before January 2013.)
Eliezer writes the “Open Problems in Friendly AI” sequence. (Before January 2013.)
We hire 1-2 researchers to produce technical write-ups from Eliezer’s TDT article and from his “Open Problems in Friendly AI” sequence. (Beginning September 2012, except that right now we don’t have the cash to hire the 1-2 people I know of who could do this and who want to start as soon as we have the money to hire them.)
With the “Open FAI Problems” sequence and the technical write-ups in hand, we greatly expand our efforts to show math/compsci researchers that there is a tractable, technical research program in FAI theory, and as a result some researchers work on the sexiest of these problems from their departments, and some other math researchers take more seriously the prospect of being hired by SI to do technical research in FAI theory. (Beginning, roughly, in April 2013.) Also: There won’t be classes on x-risk at SPARC (rationality camp for young elite math talent), but some SPARC students might end up being interested in FAI stuff by osmosis.
With a more tightly honed SI, improved fundraising practices, and visible mission-central research happening, SI is able to attract more funding and hire even more FAI researchers. (Beginning, roughly, in September 2013.)
If you want to help us make this happen, please donate during our July matching drive!
Buy one Singularity, get one free!
I have donated $6000 to CFAR, an impressive organization with great potential to pay back significant dividends to SIAI in the form of recruiting top talent for an FAI team, and increasing the breadth and depth of the donor and volunteer pools.
I thank Luke, Julia, and Anna for discussing with me which organization most needs my donation, and I would note that although all the funds for the drive have been matched (yay! good job!), both CFAR and SIAI are in a position to get significant value from marginal donations. SIAI has identified several promising mathematicians who could potentially form an FAI team, and has plans to get them to help produce recruitment materials for more potential FAI researchers. These plans take money. CFAR has run some excellent minicamps, and I think they can do even better with more people with more insights into human rationality and how to teach people to actually do it. Those people cost money. So please help support their efforts, and make yourselves stronger so you can support their efforts more.
I know we have heard a lot about how easy it is to screw up the Singularity, and then it is all over. But, after spending a week at CFAR’s minicamp and talking to so many people in the thick of dealing with the problem, I have real hope that we really can get this right and win, and every bit of support makes this more likely. So, let’s get our acts together, humanity, and take advantage of the huge opportunity and challenge we face right now.
$1000
I have donated $1000 thanks to the nice people at Personalized Medicine.
Is your paper available for download somewhere? I’d like to read it but I failed to download it before their site went 404.
The papers were, naturally, entered into my archive system; so I have local copies of all of them, and while the Internet Archive copies unaccountably all failed, the WebCite copies seem to work—Yvain’s is http://webcitation.org/65XQfSVTI
Thanks!
Whatever happened to them, anyway?
I’ve given 2x the amount that I gave on previous funding drives. You seem to be maturing as an organization and I’ve also been reasonably impressed by the attitude on display in the responses to Holden’s criticisms.
Is SPARC actually a good acronym to use? Although it’ll catch the eye of the slightly older geek in a hopefully useful manner, the Scholarly Publishing and Academic Resources Coalition (a fine body that makes the world a slightly better place) is getting politically active.
Yeah, we might change the name after the first SPARC.
Thanks for posting this here. I hadn’t been keeping tabs on the SIAI site itself and hadn’t noticed the whole matching drive until this post.
Does SI get more money if you donate via Google Checkout versus Paypal?
Google Checkout is very slightly preferred over Paypal. Both charge fees now, so a check is still better for large donations.
More info on this would be appreciated. I was under the impression that Google Checkout was free to charities.
They kept extending the free period, but Google Checkout started charging fees for non-profits at the end of last month.
I hope so, because Google Checkout is more convenient.