> There is no perfect match with Bostrom’s vulnerabilities, because the book assumed there was a relatively safe strategy: hide. If no one knows you are there, no one will attack you, because although the “nukes” are cheap, they would be destroying potentially useful resources. (This doesn’t have a parallel on our planet; you cannot really build a hidden developed country.) Ignoring this part, it is a combination of Type-1 (cheap nukes) and Type-2A (first-strike advantage): once you know the position of your target, you can anonymously send the “nukes” to eliminate them.
> The point of the Dark Forest hypothesis was precisely that in a world with such asymmetric weapons, coordination is not necessary. If you naively make yourself visible to a thousand potential enemies, it is statistically almost certain that someone will pull the trigger, for whatever reason. There is a selfish reason to pull the trigger: any alien civilization is a potential extinction threat. But the point is that even if 99% of space civilizations preferred peaceful cooperation, it wouldn’t change the outcome: expose yourself, and someone from the remaining 1% will pull the trigger. (And in later books you learn that it’s actually even worse.) This is the part that I call Moloch.
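The statistical claim here can be made concrete with a quick back-of-the-envelope calculation. This is only an illustrative sketch; the numbers (a thousand observers, each with a 1% chance of shooting) are assumptions for the sake of the example, not figures from the book:

```python
def p_at_least_one_attack(n: int, p: float) -> float:
    """Chance that at least one of n independent civilizations attacks,
    given each one attacks with probability p."""
    return 1 - (1 - p) ** n

# Even if 99% of civilizations are peaceful (p = 0.01 per observer),
# being visible to n = 1000 of them makes an attack near-certain:
print(p_at_least_one_attack(1000, 0.01))  # ≈ 0.99996
```

The point of the sketch is that the outcome is driven by the exponent, not by the median civilization: making p ten times smaller barely helps once n is large.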
> There is no perfect match with Bostrom’s vulnerabilities, because the book assumed there was a relatively safe strategy: hide. If no one knows you are there, no one will attack you, because although the “nukes” are cheap, they would be destroying potentially useful resources.
Not relevant: if you succeed in hiding, you simply fall off the vulnerability landscape. We only need to consider what happens once you have been exposed. Also, whose resources? It’s a cosmic commons, so who cares if it gets destroyed.
> The point of the Dark Forest hypothesis was precisely that in a world with such asymmetric weapons, coordination is not necessary. If you naively make yourself visible to a thousand potential enemies, it is statistically almost certain that someone will pull the trigger, for whatever reason.
That’s just a Type-1 vulnerable world. No need for the contrived argumentation the author gave.
> There is a selfish reason to pull the trigger: any alien civilization is a potential extinction threat.
Not really: cleaning up extinction threats is a public good that falls prey to the Tragedy of the Commons. Even if you made the numbers work out somehow—which is very difficult and requires certain conditions that the author has explicitly refuted (like the impossibility of colonizing other stars or of sending out spam messages)—it would still not be an example of Moloch. It would be an example of pan-galactic coordination, albeit a perverted one.
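The public-goods point can be sketched with a toy payoff calculation. All numbers (the benefit b each civilization gets from a removed threat, the cost c borne only by the striker, and the population n) are hypothetical assumptions chosen to illustrate the free-rider structure, nothing more:

```python
def net_payoff_to_striker(b: float, c: float) -> float:
    """Striker's own gain: it enjoys the benefit b like everyone else,
    but pays the full cost c alone."""
    return b - c

def collective_gain(n: int, b: float, c: float) -> float:
    """Total gain across all n civilizations if someone strikes."""
    return n * b - c

# With b = 1 and c = 5, striking is individually irrational...
print(net_payoff_to_striker(b=1.0, c=5.0))    # -4.0
# ...even though it is massively worthwhile for the galaxy as a whole:
print(collective_gain(n=1000, b=1.0, c=5.0))  # 995.0
```

Whenever b < c < n·b, every civilization prefers that someone else do the shooting, so the "clean-up" is underprovided—the standard public-goods failure, not Moloch.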