Room for more funding at the Future of Humanity Institute
In case you didn’t already know: The Future of Humanity Institute, one of the three organizations co-sponsoring LW, is a group within the University of Oxford’s philosophy department that tackles important, large-scale problems for humanity like how to go about reducing existential risk.
I’ve been casually corresponding with the FHI in an effort to learn more about the different options available for purchasing existential risk reduction. Here’s a summary of what I’ve learned from research fellow Stuart Armstrong and academic project manager Sean O’Heigeartaigh:
Sean reports that since this SIAI/FHI achievements comparison, FHI’s full-time research team has expanded to 7, the biggest it’s ever been. Sean writes: “Our output has improved dramatically by all tangible metrics (academic papers, outreach, policy impact, etc) to match this.”
Despite this, Sean writes, “we’re not nearly at the capacity we’d like to reach. There are a number of research areas into which we would very much like to expand (more machine intelligence work, synthetic biology risks, surveillance/information society work) and in which we feel that we could make a major impact. There are also quite a number of talented researchers over the past year whom we haven’t been able to employ but would dearly like to.”
They’d also like to do more public outreach, but standard academic funding routes aren’t likely to cover this. So without funding from individuals, it’s much less likely to happen.
Sean is currently working overtime to cover for a vacant administrative staff position, but he plans to release a new achievement report (see the sidebar on this page for past achievement reports) sometime in the next few months.
Although the FHI has traditionally pursued standard academic funding channels, donations from individuals (small and large) are more than welcome. (Stuart says this can’t be emphasized enough.)
Stuart reports that current academic funding opportunities are “a bit iffy, with some possible hopes”.
Sean is more optimistic than Stuart regarding near-term funding prospects, although he does mention that both Stuart and Anders Sandberg are currently being covered by FHI’s “non-assigned” funding until grants for them can be secured.
Although neither Stuart nor Sean mentions this, I assume one reason individual donations can be especially valuable is that they free FHI researchers from writing grant proposals, letting them spend more time doing actual research.
See also this interesting comment by lukeprog describing the comparative advantages of SIAI and FHI.