There might be a dark forest phenomenon with anti-totalitarianism activism.
I’ve seen a lot of people in EA saying things like “nobody is trying to prevent totalitarianism right now, therefore it’s neglected,” which leads them toward some pretty bonkers beliefs like “preventing totalitarianism is a great high-value way of contributing to making AGI end up aligned,” because they see defending democracy as an endless crusade of being on the right side of history. In reality, they have no clue that visibly precommitting to a potentially-losing side is something that competent people often avoid, or that the landscape of totalitarianism-prevention might already be pretty saturated by underground networks of rich and powerful people, who have already spent decades being really intense about quietly fighting totalitarianism, and aren’t advertising the details of their underground network of allies and alliances.
In which case, getting EA or AI safety entangled in that web would actually mean a high probability of having everything you care about hijacked, appropriated, used as cannon fodder, or some other way of being steered off a cliff.
The part of the metaphor that feels most resonant to me here is “you’re in a dark place and there’s things you’d maybe expect to see, but don’t, and the reason you don’t see X is specifically because X doesn’t want you to find it.”
I think that instead of ending with “and the reason you don’t see X is specifically because X doesn’t want you to find it”, it makes more sense to end with something more like “and the reason you don’t see X is specifically because X doesn’t want *many unknown randos with a wide variety of capabilities and tendencies* to find it”.
Maybe the other people in the dark forest want to meet people like you! But there are all sorts of other people out there too, such as opportunistic lawyers and criminals, and people much smarter, wealthier, or more aggressive than they are.
… or that the landscape of totalitarianism-prevention might already be pretty saturated by underground networks of rich and powerful people, who have already spent decades being really intense about quietly fighting totalitarianism, and aren’t advertising the details of their underground network of allies and alliances.
I agree this is likely enough, and that the opposite is commonly presupposed in a lot of writing: that there are no individuals or groups several standard deviations above the writer in competence who actually coordinate, unknown to the writer, in a way that would obviate the point the writer is trying to convey.
It raises the interesting question of why exactly this happens, since popular culture is full of stories of genius scientists, engineers, politicians, military leaders, entrepreneurs, artists, etc…
popular culture is full of stories of genius scientists, engineers, politicians, military leaders, entrepreneurs, artists, etc
I think it’s possible that there are all sorts of reasons why these people could have vanished from public view. For example, maybe most of society’s smartest people have all become socially-awkward corporate executives who already raced to the bottom and currently spend all of their brainpower outmaneuvering each other at office politics, or pumping out dozens of unique galaxy-brained resumes per day. Or maybe most of them have become software engineers who are smart enough to make $200k/y working from home one hour a day while appearing to work eight hours, and spend the rest of their time and energy hooked on major social media platforms (I’ve encountered several people in the Bay Area who do this).
It’s difficult to theorize about invisible geniuses (or the absence of invisible geniuses), because such theories are unfalsifiable. But it’s definitely possible that they’re secretly in a glamorous place, like an underground network steering and coordinating multiple intelligence agencies (in which case all potential opposition might be doomed by default), or an unglamorous place, like socially awkward corporate executives struggling at office politics, a game their personalities were never suited for, but one they keep at because they’re still too smart to fail at it outright, and were never given a fair chance to find out about superintelligence or human intelligence amplification.
… or an unglamorous place, like socially awkward corporate executives struggling at office politics, a game their personalities were never suited for, but one they keep at because they’re still too smart to fail at it outright …
That is a very interesting point. I had never even considered the possibility of someone being just smart enough to keep afloat in office politics, but not smart enough to transcend it. But when spelled out like that, it seems obvious that there must be a sizable cohort of middle managers and executives who fall into this category.
It does seem doubly tragic if they didn’t even want to do it in the first place, or if they have unsuitable personalities that effectively act as a glass ceiling, regardless of how much effort they put in.