There are claims that roughly 10% of AI researchers fall into this category. I do not know of any formal surveys on the question. I do know we keep getting examples.
In a world with both powerful propaganda and an established view that AI accelerationism is in line with US national security interests, these observations would not be surprising. By the 2020s, targeting elites had become a standard element of influence operations.
Zvi is talking about Richard Sutton’s embrace of the outright replacement of humanity by AI. I don’t think that is the kind of accelerationism that wins adherents among most elites...?