Dark Forest Theories
There’s a concept I first heard in relation to the Fermi Paradox, which I’ve ended up using a lot in other contexts.
Why do we see no aliens out there? A possible (though not necessarily correct) answer is that the aliens might not want to reveal themselves, for fear of being destroyed by larger, older, hostile civilizations. There might be friendly civilizations worth reaching out to, but the upside of finding friendlies is smaller than the downside of risking getting destroyed.
Even old, powerful civilizations can’t be sure they’re the oldest and most powerful civilization, and the eldest civilizations could be orders of magnitude more powerful still.
So, maybe everyone made an individually rational-seeming decision to hide.
A quote from the original sci-fi story I saw describing this:
“The universe is a dark forest. Every civilization is an armed hunter stalking through the trees like a ghost, gently pushing aside branches that block the path and trying to tread without sound. Even breathing is done with care. The hunter has to be careful, because everywhere in the forest are stealthy hunters like him. If he finds other life—another hunter, an angel or a demon, a delicate infant or a tottering old man, a fairy or a demigod—there’s only one thing he can do: open fire and eliminate them.”
(I consider this a spoiler for the story it’s from, so please don’t bring that up in the comments unless you use spoiler tags[1])
However this applies (or doesn’t) to aliens, I’ve found it useful to have the “Dark Forest” frame in a number of contexts where people are looking at a situation, see something missing, and are confused: “Why is nobody doing X?”, or “Why does X not exist?”. The answer may be that it does exist, but is hidden from you (often on purpose).
I once talked to someone new to my local community who said, “Nobody is building good group houses that really help people thrive. I’m going to build one and invite people to it.” I said, “Oh, people are totally building good group houses that help people thrive… but they are private institutions designed to help the people living there. The way they function is by creating a closed boundary within which people can build high-trust relationships.”
A couple other example areas where Dark Forest Theorizing is relevant:
“Why are there no good meetups?” The answer might be that there are good private meetups, but the public meetups tend to attract people who weren’t invited to private meetups because they were kinda annoying, and this creates an evaporative cooling effect where the people who could potentially make the public meetups good don’t stick around[2].
“Why is no one talking to AI companies to get them to change their ways?” The answer is that people are, but it’s a delicate, high-trust operation. AI companies are likely to tune you out if you’re telling them they’re terrible; getting them to actually listen requires both trust and an understanding of their current frame.
In some cases, a thing might not literally be hidden from you on purpose, but its absence is still evidence of something systematically important. For example: “Why aren’t the AI people making a giant mass movement to raise public awareness of AI?” Because many versions of a mass movement might turn out to be net-negative (e.g. causing political polarization that makes it harder to get the bipartisan support necessary to pass the relevant laws), and instead the people involved are focused on narrower lobbying efforts.
(This is not to say that you can’t make good public meetups, or successfully talk to AI companies and get them to change their ways, or that there aren’t good ways to do mass public outreach on AI that are being neglected. But there may be some systemic difficulties you’re missing.)
I do realize the cosmic existential-threat aspect of the original metaphor is pretty overkill for some of these. The part of the metaphor that feels most resonant to me here is “you’re in a dark place, and there are things you’d maybe expect to see but don’t, and the reason you don’t see X is specifically because X doesn’t want you to find it.”
I may have more to say on individual Dark Forests in the future. For now, I just want to present this as a model to keep in your back pocket and see where it’s useful.
I think this is a useful concept that I use several times a year. I don’t use the term “Dark Forest,” so I’m not sure how much of that can be attributed to this post, but this post is the only relevant thing in the review, so we’ll go with that.
I also appreciate how easy to read and concise this post is. It gives me a vision of how my own writing could be shorter without losing impact.
I might be a niche example, but the Dark Forest Theory as applied to meetups was novel to me and affects how I approach helping rationality meetups.
Sometimes meetups aren’t advertised for good reasons, even if those reasons aren’t articulated. From my observation, the theory does seem to make accurate claims: when I notice an odd dearth of meetups in an area where it seems like there should be more, I sometimes find out they do exist; they’re just not as public, and nobody seems to have told the more frustrating quarter of the local community.
It’s a counterintuitive sort of evidence, but it is evidence, and this essay helped me see it more clearly. It feels related to Social Dark Matter (https://www.lesswrong.com/posts/KpMNqA5BiCRozCwM3/social-dark-matter), if not exactly the same point; Social Dark Matter is the more thorough explanation, while Dark Forest Theories is more concise.
The followup work I’d like to see is on how to spot these lacunae, and how to distinguish “there’s nothing visible here because something is ‘hunting’ the visible examples” from “there’s nothing visible here because there’s actually nothing here.”
Overall, I’d be happy to have this in the Best Of LessWrong collection. A short, well-written essay that introduces a new idea you can keep in your back pocket to make sense of the world is a worthwhile addition in my book.