Robin Hanson says that those stories are rare.
Most discussion of the non-immediate future seems to take place in science fiction, and the small subset of science fiction where authors try hard to remain realistic is called hard science fiction. Loosely associated with hard science fiction is an intellectual community of people who try to make projections which are true to our best understanding of the world. They work out the broad science and engineering of plausible future space colonies, starships, virtual reality, computer networks, surveillance, software assistants, genetically engineered people, tiny machines made to atomic accuracy, and much more.
Unfortunately, few if any of these people know much social science. So their projections often combine reasonable physics or computers with laughable economic assumptions. This often seriously compromises their ability to make useful projections. (source: The Economics of Science Fiction)
This is different from social science fiction, which is deliberate exploration of other possible forms of sociological organisation.
Contagion is a hard science fiction story about a world-stopping virus, and it got every single social phenomenon correct.
Huh, I hadn’t even thought to consider Contagion as science fiction (by the time I watched it, it was just called “reality”).
That’s a pretty extreme over-dramatization. Corona isn’t even 1% as bad.
Huh? My recollection was that it didn’t model any of the disinformation campaigns or denial of its threat by governments. My vague recollection (from like 8 years ago) is that it was a series of vignettes of people caring a lot about not getting infected and being careful. Again, not very representative.
Edit: Fair enough, seems I misremembered.
I watched it beat-for-beat and it had tons of pretty much spot-on “yup, that’s the way it went down” moments.
If the Coronavirus had a 30% fatality rate people would care a lot about not getting infected in the real world, too.
That movie had been recommended to me several times, but I never got around to watching it; I guess if there’s a moment to do so, it’d be now.
I recently read a book called Artemis, which is very hard sci-fi as far as I can tell. It goes into great detail describing the economics supporting the moon base where the story takes place—how much money it makes from tourism vs. exporting rocket fuel, how it got started by a corporation under a Kenyan flag of convenience, etc. I don’t know enough economics to judge whether it succeeded, but it certainly tried to get the economics right, and seemed plausible enough to me.
I bet I know why you read that book...
I’d be interested to see how much of it you can guess. :)
My sole reason is that it’s the name of your daughter :-)
(Secondarily I imagine “you heard it was good”, maybe it’s an interesting sci-fi story.)
Correct on the first count, not the second—someone gave it to me as a gift because they saw it had the same name as my daughter.
The Mote in God’s Eye is a pretty good example of social science fiction in addition to being a great science fiction novel in general.
The Three Body Problem (trilogy) by Liu Cixin is a masterful work that combines thoughtful hard science (the author is an engineer by training) with a keen political and sociological eye. Sociology is actually a core theme of the book, and the author goes so far as to define a “cosmic sociology” (I’m probably forgetting the actual term from the book, it’s been a while since I finished reading). I can’t recommend this author enough. The story is enthralling, and the scientific and sociological predictions are delightful, surprising, and strikingly intuitive at the same time. I wouldn’t say it is merely a great sci-fi epic; it’s truly an unparalleled work of literature and imagination.
Epistemic status: Haven’t read the book, so take it with some piles of salt.
Everything I’ve heard about the Three Body Problem suggests it gets the sociology wrong and fails to model what humans actually do in crises. At least from what I’ve heard so far, it really reinforces the idea that humans fall into despair when faced with crises, when that’s really the opposite of what we know happens in real humanitarian crises. People usually substantially increase the amount of work they do, generally report higher levels of engagement, and very rarely just give up.
See also this pretty extended critique of the Three Body Problem by Jacobian: https://putanumonit.com/2018/01/07/scientist-fiction/
I think part of the value of Three Body is how foreign and yet how viscerally plausible the sociology is (even if you think that it gets things wrong, it still feels like something that could happen). Though the series mostly takes place in the future, it begins in the real, historical Cultural Revolution. It seems to me as if the whole sociology of the book is an extrapolation from what happened during the Cultural Revolution – a situation that saw large groups of humans pushed to their breaking point, and yet displayed failure modes very different from what Westerners are accustomed to.
One scene that stands out to me in this regard is when a whole population is trying to flee, but there’s only one ship that has the technology needed to get away in time. When the people realize this, they all try to shoot the ship down. This seems very much a collectivist thing to me – someone from an individualist culture might focus on people wanting to escape on that ship themselves; but the people in Three Body just want to equalize things by pulling the outlier down with them. Relatedly, I think that the reason the series as a whole feels so focused on sociology is that it is entirely about groups of people and never really about individuals at all. (Sure there are individual characters, but they mostly just function as vehicles for the story and/or ways to explore morality.)
Overall I disagree with Robin Hanson that Liu ‘gets away with things’ just because he’s Chinese. I think a large part of the value of his books is that they’re entirely Chinese in their viewpoint. While I believe in human universals, I think Westerners generally underestimate how differently from them many Chinese people think. Chinese people and Westerners may react very differently to situations, and that might explain much of why the sociology feels so wrong to readers like Robin Hanson. In any case it’s certainly fascinating and you should read it.
I have read the trilogy and I second habryka’s comment. One not-too-spoilery example is that at several points, humanity is treated as a cultural monolith: all of humanity is described as having a single reaction to an event, with no variation between countries or at least between cultural blocs.
(I greatly enjoyed the books and would recommend them to anyone who likes SF despite the criticism above.)
Ah yeah, that example is right in line with Robin Hanson’s complaint that those stories are biased by thinking in a far-view mode.
In the short term, sure. In the long term? I look around me and see people so tired of taking precautions against COVID-19 that they would rather die than spend another day wearing a face mask.
In the book, the time intervals were much longer, given the distances in the universe and the speed of light. People were capable of dramatic decisions when the threat was detected. A few years later, with the threat still on the way, they were already burned out. Sounds realistic to me.
And the “cosmic sociology” is Meditations on Moloch turned up to eleven.
Where exactly do you see Moloch in the books? It’s quite the opposite if anything; the mature civilizations of the universe have coordinated around cleaning up the cosmos of nascent civilizations, somehow, without a clear coordination mechanism. Or perhaps it’s a Type-1 vulnerable world, but it doesn’t fit well with the author’s argumentation. I’m not sure, and I’m not sure the author knows either.
I’m still a little puzzled by all the praises for the deep game theoretic insights the book series supposedly contains though. Maybe game theory as attire?
There is no perfect match with Bostrom’s vulnerabilities, because the book assumed there was a relatively safe strategy: hide. If no one knows you are there, no one will attack you, because although the “nukes” are cheap, they would be destroying potentially useful resources. (This doesn’t have a parallel on our planet; you cannot really build a hidden developed country.) Ignoring this part, it is a combination of Type-1 (cheap nukes) and Type-2A (first-strike advantage): once you know the position of your target, you can anonymously send the “nukes” to eliminate them.
The point of the Dark Forest hypothesis was precisely that in a world with such asymmetric weapons, coordination is not necessary. If you naively make yourself visible to a thousand potential enemies, it is statistically almost certain that someone will pull the trigger, for whatever reason. There is a selfish reason to pull the trigger: any alien civilization is a potential extinction threat. But the point is that even if 99% of space civilizations preferred peaceful cooperation, it wouldn’t change the outcome—expose yourself, and someone from the remaining 1% will pull the trigger. (And in later books you learn that it’s actually even worse.) This is the part that I call Moloch.
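To make the “someone will pull the trigger” point concrete, here is a minimal back-of-the-envelope sketch; the 1% strike probability and the observer counts are illustrative assumptions of mine, not numbers from the books:

```python
# Back-of-the-envelope for the "someone will pull the trigger" argument.
# Assumes N observing civilizations, each independently striking with a
# small probability p; both numbers are illustrative, not from the books.

def p_any_strike(n_observers: int, p_strike: float) -> float:
    """P(at least one strike) = 1 - (1 - p)^N for independent observers."""
    return 1.0 - (1.0 - p_strike) ** n_observers

for n in (10, 100, 1000):
    print(f"{n:4d} observers -> {p_any_strike(n, 0.01):.5f}")
# 10   -> 0.09562
# 100  -> 0.63397
# 1000 -> 0.99996  (near-certain even though 99% never fire)
```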
Not relevant, if you succeed in hiding you simply fall off the vulnerability landscape. We only need to consider what happens when you’ve been exposed. Also, whose resources? It’s a cosmic commons, so who cares if it gets destroyed.
That’s just Type-1 vulnerable world. No need for the contrived argumentation the author gave.
Not really, cleaning up extinction threats is a public good that falls prey to the Tragedy of the Commons. Even if you made the numbers work out somehow—which is very difficult and requires certain conditions that the author has explicitly refuted (like the impossibility of colonizing other stars or of sending out spam messages) - it would still not be an example of Moloch. It would be an example of pan-galactic coordination, albeit a perverted one.
Very much disagree. My sense is that the book series is pretty meagre on presenting “thoughtful hard science” as well as game theory and human sociology.
To pick the most obvious example—the title of the trilogy* - the three-body problem is misrepresented in the books as “it’s hard to find the general analytic solution” instead of “the end state is extremely sensitive to changes in the initial conditions”, and the characters in the book (both humans and Trisolarians) spend eons trying to solve the problem mathematically.
But even if an exact solution were found—one does exist for some chaotic systems, like the logistic map—it would have been useless, since the initial conditions cannot be known perfectly. This isn’t a minor nitpick like the myriad other scientific problems with the Trisolarian system that can be more easily forgiven as artistic license; this is missing what chaotic systems are about. Why even invoke the three-body problem other than as attire?
*not technically the title of the book series, but frequently referred to as such
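To make the sensitivity point concrete, here is a tiny sketch (the map parameter r = 4, the starting value, and the 1e-12 perturbation are arbitrary illustrative choices): even knowing the exact update rule, two trajectories that start a hair apart become uncorrelated within a few dozen steps.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n) at r = 4 (fully chaotic regime).
# Two trajectories starting 1e-12 apart diverge within a few dozen steps,
# so knowing the exact update rule doesn't buy long-range prediction.
r = 4.0
x, y = 0.2, 0.2 + 1e-12   # initial conditions differing by 1e-12 (illustrative)

for n in range(1, 61):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if n % 10 == 0:
        print(f"step {n:2d}: |x - y| = {abs(x - y):.3e}")
# The gap grows roughly exponentially (about a factor of 2 per step, since
# the Lyapunov exponent is ln 2) until it saturates at order 1.
```

The design point: every extra digit of precision in the initial condition only buys a handful of extra predictable steps, which is why an exact solution formula would not have helped the characters.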
Well, integrating all our best knowledge of the social sciences into SciFi is hard. I am not sure I can judge whether it was successful in most cases. What I can point out instead is a couple of works where something like this has been attempted, in that the author gave serious thought to how different technology and environments would affect society:
Moon Is A Harsh Mistress by Heinlein
I think the elaboration on how family structure changes due to the low female-to-male ratio and the dangerous environment makes sense ( https://www.reddit.com/r/AskScienceFiction/comments/214z3t/the_moon_is_a_harsh_mistress_how_do_line/ ), and the book contains a lot of good thought on other things like this.
Revelation Space series by Alastair Reynolds
These books describe a lot of different social structures made possible by technology, such as Democratic Anarchism (quoting from the wiki):
“The Demarchy functioned by means of a neural implant that constantly sought the user’s opinion on aspects of Demarchist life. This constant prompting eventually faded away into the user’s neural background, much like the ticking of a clock might fade away into background noise.
...
Each core was tasked with collecting and processing votes, and also determining whether or not the elected decision was the best one in previous elections—voters who continually made “good” decisions were rewarded by having their vote count for more than one standard vote.”
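As a rough illustration of the mechanism in that quote, here is a toy sketch of reputation-weighted voting; the starting weight, the bonus size, and the function names are my own illustrative assumptions, not mechanics from the books.

```python
# Toy model of the reputation-weighted voting described in the quote.
# The starting weight of 1.0 and the +0.1 bonus for a "good" call are
# illustrative assumptions, not taken from the books.

def tally(votes: dict[str, bool], weights: dict[str, float]) -> bool:
    """Return the weighted majority decision (True = yes)."""
    yes = sum(weights.get(v, 1.0) for v, choice in votes.items() if choice)
    no = sum(weights.get(v, 1.0) for v, choice in votes.items() if not choice)
    return yes > no

def reward_good_voters(votes, weights, decision, decision_was_good):
    """Voters who backed a decision later judged 'good' gain voting weight."""
    if decision_was_good:
        for voter, choice in votes.items():
            if choice == decision:
                weights[voter] = weights.get(voter, 1.0) + 0.1

weights: dict[str, float] = {}
votes = {"alice": True, "bob": False, "carol": True}
decision = tally(votes, weights)                    # True: 2.0 vs 1.0
reward_good_voters(votes, weights, decision, True)  # alice, carol -> 1.1
```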
The second book (“Chasm City”) also elaborates on how the crew of a small fleet of multi-generational interstellar ships would live and be organized until they reach their destination in the not-so-far future (so without neural links and magic gadgets).
We Are Legion (We Are Bob) by Dennis E. Taylor.
This one is less rigorous on the “hard” part than the previous two. I really like some of the ideas, such as: if human minds can be uploaded and multiplied, we could end up with a whole industry being operated by copies of a single expert.
Depending on what you define as hard SF, the Nexus trilogy by Ramez Naam might fit. I’ve seen it described as hard SF, as “having some hard scifi elements”, and as “dumbed down science fiction” by a particularly displeased Goodreads user; take your pick. I remember it being very easy to read and making more of an effort than usual to consider the society surrounding the characters. That might not mean getting the social sciences exactly right, though.
In general, I think the closer the time period is to our own, the more likely it is that the social sciences will be right, since the author will have more material to base them on.
It’s been some time since I read one of his books and I’m not sure if he counts as hard SF, but maybe Ian McDonald? I remember being impressed by the complexity of River of Gods. If anyone has read more of his books, please confirm or refute my guess.
Black Mirror?