This sounds more or less correct to me. Open Philanthropy (Open Phil) is the largest AI safety grant-maker: it spent over $70 million on AI safety grants in 2022, whereas LTFF spent only ~$5 million. In 2022, the median Open Phil AI safety grant was $239k, whereas the median LTFF grant was only $19k.
Open Phil and LTFF made 53 and 135 AI safety grants respectively in 2022. This means the average Open Phil AI safety grant in 2022 was ~$1.3 million, whereas the average LTFF grant was only ~$38k. So the average Open Phil AI safety grant is roughly 30 times larger than the average LTFF grant.
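As a quick sanity check, here is a minimal back-of-the-envelope sketch in Python that reproduces these ratios. The inputs are the approximate 2022 figures quoted in this comment, not an authoritative dataset:

```python
# Approximate 2022 figures quoted in this comment (not an authoritative dataset).
open_phil_total, open_phil_grants = 70_000_000, 53
ltff_total, ltff_grants = 5_000_000, 135

avg_open_phil = open_phil_total / open_phil_grants  # ~$1.32 million
avg_ltff = ltff_total / ltff_grants                 # ~$37k

print(f"Average Open Phil grant: ${avg_open_phil:,.0f}")
print(f"Average LTFF grant:      ${avg_ltff:,.0f}")
# ~36x with these exact inputs; the text rounds to ~30x
print(f"Average-grant ratio:     {avg_open_phil / avg_ltff:.0f}x")
# The ~14x total-spend gap discussed below
print(f"Total-spend ratio:       {open_phil_total / ltff_total:.0f}x")
```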
These calculations imply that Open Phil and LTFF make a similar number of grants (LTFF actually makes more) and that Open Phil spends much more simply because its grants tend to be much larger (~30x). So it seems funds may be more constrained by their capacity to evaluate and fulfill grants than by a lack of money. This is not surprising given that the LTFF grant-makers apparently work part-time.
Counterintuitively, it may be easier for an organization (e.g. Redwood Research) to get a $1 million grant from Open Phil than for an individual to get a $10k grant from LTFF. The reason is that both grants probably require a similar amount of administrative effort, and a well-known organization is more likely to be trusted to use the money well than an individual, so the decision is easier to make. This example illustrates how decision-making and grant-making processes are probably just as important as the total amount of money available.
LTFF specifically could be funding-constrained, though, given that it spends only ~$5 million per year on AI safety grants. Since ~40% of LTFF’s funding comes from Open Phil and Open Phil has much more money than LTFF, one solution would be for LTFF to simply ask Open Phil for more.
I don’t know why Open Phil spends so much more on AI safety than LTFF (~14x more). Maybe it’s because of administrative hurdles LTFF faces when requesting money from Open Phil, or maybe Open Phil prefers to make grants directly.
Here is a spreadsheet comparing how much Open Phil, LTFF, and the Survival and Flourishing Fund (SFF) spend on AI safety per year.
Counterpoint: after the FTX collapse, Open Phil said publicly (in an EA Forum post) that they were raising their bar for funding, i.e. some things that would have been funded before would no longer be funded. The stated reason was that there is generally less money around, in total. To me, this sounds like what you would do if money is the limitation.
I don’t know why Open Phil doesn’t spend more. Maybe they have long timelines and also don’t expect any more big donors any time soon, and that’s why they want to spend carefully?
I’d also add that, historically, I believe about two-thirds of LTFF’s money has come from Open Phil, so LTFF isn’t a fully independent funder (though its decision-making around grants is pretty independent).
Based on what I’ve written here, my verdict is that AI safety seems more funding-constrained for small projects and individuals than for organizations, for the following reasons:
- The funds that fund smaller projects, such as LTFF, tend to have less money than funds like Open Phil, which seems more focused on making larger grants to organizations (Open Phil spends ~14x more per year on AI safety).
- Funding could be constrained by the throughput of grant-makers (the number of grants they can make per year). This puts funds like LTFF at a disadvantage: because they make a larger number of smaller grants, they are limited more by throughput than by the total amount of money available. Low throughput also incentivizes making a small number of large grants, which favors large existing organizations over smaller projects or individuals.
- Individuals or small projects tend to be less well-known than organizations, so grants to them can be harder to evaluate and may be more likely to be rejected. On the other hand, smaller grants are less risky.
- Demand for funding from individuals or small projects could grow much faster than demand from organizations, because new organizations take time to create (though perhaps existing organizations can scale quickly).
Some possible solutions:
- Move more money to smaller funds that tend to make smaller grants. For example, LTFF could ask Open Phil for more money.
- Hire more grant evaluators, or hire full-time evaluators, to raise the ceiling on the number of grants that can be made per year.
- Demonstrate that smaller projects or individuals can be as effective as organizations, in order to increase trust in them.
- Seek more funding: about half of LTFF’s funds come from direct donations, so it could seek more of these.
- Existing organizations could hire more individuals, rather than those individuals seeking funding themselves.
- Individuals (e.g. independent researchers) could form organizations to reduce the administrative load on grant-makers and increase their credibility.
A priori, and from talking with some grant-makers, I’d think the split would be between people and orgs who are well-known to the grant-makers and those who are not. Why do you think the split is around people vs orgs?
That seems like a better split, though there are outliers of course. But I think orgs are more likely to be well-known to grant-makers on average, given that they tend to have higher research output, more marketing, and the ability to organize events. An individual is like an organization with one employee.
I think you’re getting the causality backwards. You need money first, before there is an org, unless you count informal multi-person collaborations as orgs.
I think people who are more well-known to grant-makers are more likely to start orgs, whereas people who are less well-known are more likely to get funding at all if they aim for a smaller grant, e.g. as an independent researcher.