It feels sorta understandable to me (albeit frustrating) that OpenPhil faces these assorted political constraints. In my view this seems to create a big unfilled niche in the rationalist ecosystem: a new, more right-coded, EA-adjacent funding organization could optimize itself for being able to enter many of those blacklisted areas with enthusiasm.
If I were a billionaire, I would love to put together a kind of “completion portfolio” to complement some of OP’s work: rationality community building, macrostrategy stuff, AI-related advocacy to try and influence Republican politicians, plus a big biotechnology emphasis focused on intelligence enhancement, reproductive technologies, slowing aging, cryonics, gene drives for eradicating diseases, etc. Basically, it seems like there is enough edgy-but-promising stuff out there (like studying geoengineering for climate, advocating for charter cities, or just funding oddball Substack intellectuals to do their thing) that you could hope to create a kind of “alt-EA” (obviously IRL it shouldn’t have EA in the name) where you batten down the hatches, accept that the media will call you an evil villain mastermind forever, and hope to create a protective umbrella for all the work that can’t get done elsewhere. As a bonus, you could engage more in actual politics (like having some hot takes on the US budget deficit, or on how to increase marriage and fertility rates, or whatever) in some areas that OP, in its quest for center-left non-polarization, can’t touch.
Peter Thiel already lives this life, kinda? But his model seems (1) much more secretive, and (2) less directly EA-adjacent, than what I’d try if I were a billionaire.
Dustin himself talks about how he is really focused on getting more “multipolarity” to the EA landscape, by bringing in other high-net-worth funders. For all the reasons discussed, he obviously can’t say “hi, somebody please start an edgier right-wing offshoot of EA!!” But it seems like a major goal that the movement should have, nonetheless.
Seems like you could potentially also run this play with a more fully-left-coded organization. The gains there would probably be smaller, since there’s less “room” to OP’s left than to their right. But maybe you could group together wild animal welfare, invertebrate welfare, digital minds, perhaps some David Pearce / Project Far Out-style “suffering abolition” transhumanist stuff, other mental-wellbeing work like the Organization for the Prevention of Intense Suffering, S-risk work, etc. Toss in some more aggressive political activism on AI (like PauseAI) and other issues (like Georgist land value taxation), and maybe some forward-looking political work on avoiding stable totalitarianism, regulation of future AI-enabled technologies, and how to distribute the gains from a positive / successful singularity (akin to Sam Altman’s vision of UBI supported by Georgist/Pigouvian taxes, but more thought-through, detailed, and up to date).
Finding funders to fill these niches seems like it should be a very high priority for the rationalist / EA movement. Even if the funders were relatively small at first (say they have $10M–$100M in crypto that they are preparing to give away), I think there could be a lot of value in being “out and proud” (publicizing much of their research, philosophy, and grantmaking like OP, rather than being super-secretive like Peter Thiel). If a small funder manages to build a small but successful “alt-EA” ecosystem on either the left or the right, that might attract larger funders in time.