People usually substantially increase the amount of work they do, generally report higher levels of engagement, and very rarely just give up.
In the short term, sure. In the long term? I look around me and see people so tired of taking precautions against COVID-19 that they would rather die than spend another day wearing a face mask.
In the book, the time intervals were much longer, given the distances in the universe and the speed of light. People were capable of dramatic decisions when the threat was detected. A few years later, with the threat still on the way, they were already burned out. Sounds realistic to me.
And the “cosmic sociology” is Meditations on Moloch turned up to eleven.
Where exactly do you see Moloch in the books? It’s quite the opposite if anything; the mature civilizations of the universe have coordinated around cleaning up the cosmos of nascent civilizations, somehow, without a clear coordination mechanism. Or perhaps it’s a Type-1 vulnerable world, but it doesn’t fit well with the author’s argumentation. I’m not sure, and I’m not sure the author knows either.
I’m still a little puzzled by all the praise for the deep game-theoretic insights the book series supposedly contains, though. Maybe game theory as attire?
There is no perfect match with Bostrom’s vulnerabilities, because the book assumed there was a relatively safe strategy: hide. If no one knows you are there, no one will attack you, because although the “nukes” are cheap, they would be destroying potentially useful resources. (This doesn’t have a parallel on our planet; you cannot really build a hidden developed country.) Ignoring this part, it is a combination of Type-1 (cheap nukes) and Type-2A (first-strike advantage): once you know the position of your target, you can anonymously send the “nukes” to eliminate them.
The point of the Dark Forest hypothesis was precisely that in a world with such asymmetric weapons, coordination is not necessary. If you naively make yourself visible to a thousand potential enemies, it is statistically almost certain that someone will pull the trigger, for whatever reason. There is a selfish reason to pull the trigger; any alien civilization is a potential extinction threat. But the point is that even if 99% of space civilizations preferred peaceful cooperation, it wouldn’t change the outcome: expose yourself, and someone from the remaining 1% will pull the trigger. (And in later books you learn that it’s actually even worse.) This is the part that I call Moloch.
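To put rough numbers on that claim, here is a quick back-of-the-envelope sketch in Python. The 1,000 observers and the 1% (or 0.1%) hostile rate are just the illustrative figures from the paragraph above, not anything from the books; the point is only how fast “almost everyone is peaceful” stops mattering.

```python
# Probability that at least one of n independent observers is willing to shoot,
# given a small per-observer probability p of being hostile.
def p_someone_shoots(n_observers: int, p_hostile: float) -> float:
    return 1 - (1 - p_hostile) ** n_observers

# Illustrative values only (not from the books):
print(p_someone_shoots(1000, 0.01))   # ~0.99996 - exposure is almost certain doom
print(p_someone_shoots(1000, 0.001))  # ~0.63    - even 0.1% hostiles is worse than a coin flip
```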
There is no perfect match with Bostrom’s vulnerabilities, because the book assumed there was a relatively safe strategy: hide. If no one knows you are there, no one will attack you, because although the “nukes” are cheap, they would be destroying potentially useful resources.
Not relevant; if you succeed in hiding, you simply fall off the vulnerability landscape. We only need to consider what happens once you’ve been exposed. Also, whose resources? It’s a cosmic commons, so who cares if it gets destroyed.
The point of the Dark Forest hypothesis was precisely that in a world with such asymmetric weapons, coordination is not necessary. If you naively make yourself visible to a thousand potential enemies, it is statistically almost certain that someone will pull the trigger, for whatever reason.
That’s just a Type-1 vulnerable world. No need for the contrived argumentation the author gave.
There is a selfish reason to pull the trigger; any alien civilization is a potential extinction threat.
Not really; cleaning up extinction threats is a public good that falls prey to the Tragedy of the Commons. Even if you somehow made the numbers work out (which is very difficult, and requires conditions that the author has explicitly refuted, like the impossibility of colonizing other stars or of sending out spam messages), it would still not be an example of Moloch. It would be an example of pan-galactic coordination, albeit a perverted one.
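To spell out the free-rider logic being claimed here, a toy payoff comparison with made-up numbers (nothing book-specific): the striker alone pays the cost of a strike, while the benefit of one fewer potential threat accrues to every civilization, striker or not.

```python
# Toy public-goods payoff with made-up numbers, to illustrate the
# Tragedy of the Commons claim above (not anything from the books).
N_CIVILIZATIONS = 1000
benefit_to_each = 1.0   # value to any one civilization of a nascent threat being removed
cost_to_striker = 1.5   # paid only by whoever actually pulls the trigger

collective_benefit = N_CIVILIZATIONS * benefit_to_each   # 1000.0: clearly worth it galaxy-wide
payoff_if_i_strike = benefit_to_each - cost_to_striker   # -0.5: an individual loss
payoff_if_i_free_ride = benefit_to_each                  #  1.0: let someone else pay the cost

# Cleanup is collectively rational but individually irrational, so absent some
# coordination mechanism each civilization waits for someone else to shoot.
print(collective_benefit, payoff_if_i_strike, payoff_if_i_free_ride)
```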