Epistemic status: Haven’t read the book, so take it with piles of salt.
Everything I’ve heard about The Three-Body Problem gets the sociology wrong and seems to fail to model what humans actually do in crises. At least from what I’ve heard so far, it really reinforces the “humans fall into despair when faced with crises” trope, which is the opposite of what we know happens in real humanitarian crises. People usually substantially increase the amount of work they do, generally report higher levels of engagement, and very rarely just give up.
See also this pretty extended critique of the Three Body Problem by Jacobian: https://putanumonit.com/2018/01/07/scientist-fiction/
I think part of the value of Three Body is how foreign and yet how viscerally plausible the sociology is (even if you think it gets things wrong, it still feels like something that could happen). Though the series mostly takes place in the future, it begins in the real, historical Cultural Revolution. It seems to me that the whole sociology of the book is an extrapolation from what happened during the Cultural Revolution – a situation that pushed large groups of humans to their breaking point, and that displayed failure modes very different from the ones Westerners are accustomed to.
One scene that stands out to me in this regard is when a whole population is trying to flee, but only one ship has the technology needed to get away in time. When the people realize this, they all try to shoot the ship down. This seems like a very collectivist thing to me – someone from an individualist culture might focus on wanting to escape on that ship themselves, but the people in Three Body just want to equalize things by pulling the outlier down with them. Relatedly, I think the reason the series as a whole feels so focused on sociology is that it is entirely about groups of people and never really about individuals at all. (Sure, there are individual characters, but they mostly just function as vehicles for the story and/or ways to explore morality.)
Overall I disagree with Robin Hanson that Liu ‘gets away with things’ just because he’s Chinese. I think a large part of the value of his books is that they’re entirely Chinese in their viewpoint. While I believe in human universals, I think Westerners generally underestimate how differently many Chinese people think from them. Chinese people and Westerners may react very differently to the same situations, and that might explain much of why the sociology feels so wrong to readers like Robin Hanson. In any case it’s certainly fascinating, and you should read it.
I have read the trilogy and I second habryka’s comment. One not-too-spoilery example: at several points humanity is treated as a cultural monolith – all of humanity is described as having a single reaction to an event, with no variation between countries or even between cultural blocs.
(Despite the criticism above, I greatly enjoyed the books and would recommend them to anyone who likes SF.)
Ah yeah, that example is just in line with Robin Hanson’s complaint that these stories are biased by thinking in far mode.
People usually substantially increase the amount of work they do, generally report higher levels of engagement, and very rarely just give up.
In the short term, sure. In the long term? I look around me and see people so tired of taking precautions against COVID-19 that they would rather die than spend another day wearing a face mask.
In the book, the time intervals were much longer, given the distances in the universe and the speed of light. People were capable of dramatic decisions when the threat was first detected. A few years later, with the threat still on its way, they were already burned out. Sounds realistic to me.
And the “cosmic sociology” is Meditations on Moloch turned up to eleven.
Where exactly do you see Moloch in the books? If anything, it’s quite the opposite: the mature civilizations of the universe have somehow coordinated around cleansing the cosmos of nascent civilizations, without any clear coordination mechanism. Or perhaps it’s a Type-1 vulnerable world, but that doesn’t fit well with the author’s argumentation. I’m not sure, and I’m not sure the author knows either.
I’m still a little puzzled by all the praise for the deep game-theoretic insights the book series supposedly contains, though. Maybe game theory as attire?
There is no perfect match with Bostrom’s vulnerabilities, because the book assumes there is a relatively safe strategy: hide. If no one knows you are there, no one will attack you, because although the “nukes” are cheap, they would destroy potentially useful resources. (This has no parallel on our planet; you cannot really build a hidden developed country.) Ignoring this part, it is a combination of Type-1 (cheap nukes) and Type-2a (first-strike advantage): once you know the position of your target, you can anonymously send the “nukes” to eliminate it.
The point of the Dark Forest hypothesis was precisely that in a world with such asymmetric weapons, coordination is not necessary. If you naively make yourself visible to a thousand potential enemies, it is statistically almost certain that someone will pull the trigger, for whatever reason. There is a selfish reason to pull the trigger: any alien civilization is a potential extinction threat. But the point is that even if 99% of space civilizations preferred peaceful cooperation, it wouldn’t change the outcome – expose yourself, and someone from the remaining 1% will pull the trigger. (And in later books you learn that it’s actually even worse.) This is the part that I call Moloch.
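To make the statistical point concrete, here is a minimal sketch; the observer count and the 1% hostile fraction are my own illustrative numbers, not figures from the book:

```python
# Survival chance for a civilization that reveals its location, assuming each
# observer independently decides whether to strike. The observer count and the
# 1% "trigger-happy" fraction are illustrative, not from the book.

def survival_probability(num_observers: int, p_strike: float) -> float:
    """Probability that none of the observers pulls the trigger."""
    return (1.0 - p_strike) ** num_observers

# Even if 99% of civilizations are peaceful (p_strike = 0.01), being visible
# to 1,000 of them makes survival vanishingly unlikely:
print(survival_probability(1000, 0.01))  # ~4.3e-05
```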
There is no perfect match with Bostrom’s vulnerabilities, because the book assumes there is a relatively safe strategy: hide. If no one knows you are there, no one will attack you, because although the “nukes” are cheap, they would destroy potentially useful resources.
Not relevant: if you succeed in hiding, you simply fall off the vulnerability landscape. We only need to consider what happens once you’ve been exposed. Also, whose resources? It’s a cosmic commons, so who cares if it gets destroyed?
The point of the Dark Forest hypothesis was precisely that in a world with such asymmetric weapons, coordination is not necessary. If you naively make yourself visible to a thousand potential enemies, it is statistically almost certain that someone will pull the trigger, for whatever reason.
That’s just a Type-1 vulnerable world. No need for the contrived argumentation the author gave.
There is a selfish reason to pull the trigger: any alien civilization is a potential extinction threat.
Not really: cleaning up extinction threats is a public good, and it falls prey to the tragedy of the commons. Even if you made the numbers work out somehow – which is very difficult and requires certain conditions that the author has explicitly ruled out (like the impossibility of colonizing other stars or of sending out spam messages) – it would still not be an example of Moloch. It would be an example of pan-galactic coordination, albeit a perverted one.
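A minimal sketch of the free-rider dynamic being pointed at here, with entirely made-up payoff numbers (not from the book or from Bostrom): whenever someone else can be expected to strike, each civilization does better by holding its fire.

```python
# Toy payoff for the free-rider problem: destroying a nascent civilization
# removes a shared risk (benefit to everyone), but the striker alone bears
# the cost (e.g. the risk of revealing itself). All numbers are made up.
BENEFIT = 10  # value to each civilization of the threat being removed
COST = 3      # cost paid only by whoever strikes

def payoff(i_strike: bool, someone_else_strikes: bool) -> int:
    threat_removed = i_strike or someone_else_strikes
    return (BENEFIT if threat_removed else 0) - (COST if i_strike else 0)

# If anyone else will strike, free-riding beats striking (10 > 7), so each
# civilization prefers to let the others do the dirty work - the classic
# reason public goods get underprovided.
print(payoff(True, True), payoff(False, True))    # 7 10
print(payoff(True, False), payoff(False, False))  # 7 0
```

With these numbers it is a volunteer’s dilemma: striking is still worth it if nobody else will (7 > 0), but everyone prefers that someone else pay the cost.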