I expect EA Funds – and the Long-Term Future Fund in particular – to be of interest to people on LessWrong, so I’m crossposting my EA Forum post with the excerpts that seem most relevant:
Summary
The Animal Welfare Fund, the Long-Term Future Fund, and the EA Infrastructure Fund (formerly the EA Meta Fund) are calling for applications.
Applying is fast and easy – it typically takes no more than a few hours. If you are unsure whether to apply, simply give it a try.
The Long-Term Future Fund and EA Infrastructure Fund now support anonymized grants: if you prefer not to have your name listed in the public payout report, we are still interested in funding you.
If you have a project you think will improve the world, and it seems like a good fit for one of our funds, we encourage you to apply by 7 March (11:59pm PST). Apply here. We’d be excited to hear from you!
Recent updates
The Long-Term Future Fund and EA Infrastructure Fund now officially support anonymized grants. To be transparent towards donors and the effective altruism community, we generally prefer to publish a report about your grant, with your name attached to it. But if you prefer that we not disclose any of your personal information, you can now choose one of the following options:
1) Requesting that the public grant report be anonymized. We will consider your request, but in some cases we may end up asking you to choose between a public grant and no grant at all.
2) Requesting that we not publish a public grant report of any kind. In this case, if we think the grant is above our threshold for funding, we will refer it to private funders.
(…)
Long-Term Future Fund
The Long-Term Future Fund aims to positively influence the long-term trajectory of civilization, primarily via making grants that contribute to the mitigation of global catastrophic risks. Historically, we’ve funded a variety of longtermist projects, including:
Scholarships, academic teaching buy-outs, and additional funding for academics to free up their time
Funding to make existing researchers more effective
Direct work in AI, biosecurity, forecasting, and philanthropic timing
Up-skilling in a field to prepare for future work
Seed money for new organizations
Movement-building programs
See our previous grants here. Most of our grants are reported publicly, but we also give applicants the option to receive an anonymous grant, or to be referred to a private donor.
The fund has an intentionally broad remit that encompasses a wide range of potential projects. We strongly encourage anyone who thinks they could use money to benefit the long-term future to apply.
(…)
What types of grants can we fund?
For grants to individuals, all of our funds can likely make the following types of grants:
Events/workshops
Scholarships
Self-study
Research projects
Content creation
Product creation (e.g., a tool/resource that can be used by the community)
We can refer applications for for-profit projects (e.g., seed funding for start-ups) to EA-aligned investors. If you are a for-profit, simply apply through the standard application form and indicate your for-profit status in the application.
For legal reasons, we will likely not be able to make the following types of grants:
Grantseekers requesting funding for a list of possible projects
In this case, we would fund only one of the proposed projects. Feel free to apply with multiple projects, but we will have to confirm a specific project before we issue funding.
Self-development that is not directly related to the common good
In order to make grants, the public benefit needs to be greater than the private benefit to any individual. So we cannot make grants that focus on helping a single individual in a way that is not directly connected to public benefit.
Please err on the side of applying, as it is likely we will be able to make something work if the fund managers are excited about the project. We look forward to hearing from you.
I am currently writing fiction that features protagonists who are EAs.
This seems at least related to the infrastructure fund goal of presenting EA principles and exposing more people to them.
I think receiving a grant would make me more likely to aggressively pursue options to professionally edit, publish, and publicize the work. That feels kind of selfish and makes me self-conscious, but it also wouldn’t require a very large grant. It’s hard for me to untangle my feelings about this from the actual public good, so I’m asking here first.
Does this sound like a good use of a grant?
I am reasonably excited about fiction (and am on the Long-Term Future Fund). I have written previously about my thoughts on fiction here:
I’ve got some partial outlines for what I think are interesting sci-fi that I’ve wanted to pay to have ghostwritten or turned into a short film. Is this the right place for that?
Maybe, but it really depends on whether you have a good track record, or whether there is some other reason why it seems like a good idea to fund from an altruistic perspective.
I largely agree with Habryka’s perspective. I personally (not speaking on behalf of the EA Infrastructure Fund) would be particularly interested in such a grant if you had a track record of successful writing, as this would make it more likely you’d actually reach a large audience. E.g., Eliezer did not just write HPMoR; he was also a successful blogger on Overcoming Bias and wrote the Sequences.