I think people need to remember one very, very important mantra: “I might be wrong!” We all love trying to calculate the odds, weighing up the possibilities, and then deciding “Well, I’m very informed, I must be right!” But we always have a possibility of being stonkingly, and hilariously, wrong on every count. There are no soothsayers; the future isn’t here yet.
For all we know, AGI turns up out of the blue and it turns out to be one of those friendly Minds out of the old Iain Banks novels, fond by default of their simple mush-brained human antecedents and ready and willing to help. I mean, it’s possible, right?
And it might just be like that, because we all did the work. And then you get to tell your grandkids one day, “Hey, we used to be a bit worried the Minds would kill us all. But I helped research a way to make sure that never happens.” And your grandkids will think you’re somewhat excellent. Isn’t that a good thought?
Shayne O'Neill
The count of “how many humans will be born” is a pretty useful number for moral reasoning about how our actions today relate to the future. If we neglect carbon-induced climate change because we won’t be around for the worst of it, we are dooming potentially trillions of future humans to a lousy existence through our lack of action. If we assume that their lives have the same value as our own (we do have to be careful with this line of reasoning; taken to its logical ends it has intolerable implications for a currently hot topic in the courts), then the immorality of ignoring their plight is immense. Bad news.
Putting a number on it lets us factor that into a utilitarian calculus. Good stuff. Kurzgesagt really do science communication the right way.
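To make that concrete, here is a back-of-envelope sketch of the arithmetic. Every number in it is an assumption I have picked purely for illustration, not a real estimate:

```python
# Toy expected-value calculation for future lives; all figures are
# illustrative assumptions, not real estimates.
future_humans = 1e12      # "potentially trillions" of future people
value_per_life = 1.0      # weight each future life equally with ours
risk_reduction = 1e-4     # shave 0.01% off the odds of a ruined future

expected_lives_helped = future_humans * risk_reduction * value_per_life
print(f"{expected_lives_helped:,.0f}")  # 100,000,000
```

Even a tiny improvement in the odds, multiplied across trillions of future lives, swamps most present-day considerations; that is the whole force of putting a number on it.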
The “Dark Forest” idea actually appeared earlier, in the 1995 novel “The Killing Star” by Charles Pellegrino and George Zebrowski. (I’m not implying [mod-edit]the author you cite[/mod-edit] ripped it off; I have no claims to make on that, rather that he was beaten to the punch.) And I think The Killing Star’s version of the idea (Pellegrino uses the metaphor “Central Park after dark”) is slightly stronger.
The Killing Star’s method of annihilation is the relativistic kill vehicle: if you can accelerate a rock to relativistic speed (say 1/3 the speed of light), you have a planet buster, and such a weapon is almost unstoppable even if, by sheer luck, you do see the damn thing coming. It’s low-tech, lethal, and well within the capabilities of any species advanced enough to leave its solar system.
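The “planet buster” claim is easy to check with the relativistic kinetic energy formula, KE = (γ - 1)mc². A quick sketch (the one-tonne mass is just my illustrative pick):

```python
import math

C = 299_792_458.0        # speed of light, m/s
MEGATON_TNT = 4.184e15   # joules per megaton of TNT

def relativistic_ke(mass_kg: float, beta: float) -> float:
    """Kinetic energy in joules of a mass moving at beta * c."""
    gamma = 1.0 / math.sqrt(1.0 - beta ** 2)
    return (gamma - 1.0) * mass_kg * C ** 2

ke = relativistic_ke(1_000.0, 1.0 / 3.0)  # a modest one-tonne rock at 1/3 c
print(f"{ke:.2e} J, roughly {ke / MEGATON_TNT:,.0f} megatons of TNT")
# ~5.5e18 J, on the order of 1,300 megatons
```

Scale the mass up to asteroid size and the numbers stop sounding like hyperbole.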
So Pellegrino argues that, as a matter of simple game theory, because diplomacy is nigh-on impossible thanks to light-speed delay, the most rational response to discovering another alien civilization in space is “Do unto the other fellow as he would do unto you, and do it first.” Since you don’t know the other civilization’s temperament, you can only assume it has a survival instinct, and would therefore kill you to preserve itself at even the slightest possibility that you would kill them, because you would do precisely the same.
Thus such an act of interstellar omnicide is not an act of malice or aggression, but simply self-preservation. And, of course, if you don’t wish to engage in such cosmic violence, the alternative as a species is to remain very silent.
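The dominance logic can be sketched as a toy decision model. The structure and probabilities below are my own illustration, not anything from the novel; the key assumption is that a kill vehicle, once launched, cannot be stopped:

```python
# Toy first-strike model: if you wait, you survive only if the other
# side never launches; if you strike now, you die only if their
# vehicle is already inbound. All probabilities are illustrative.

def p_survive(strike_now: bool, p_ever_launch: float) -> float:
    p_already_inbound = 0.5 * p_ever_launch  # crude guess: half of launches predate yours
    return 1 - p_already_inbound if strike_now else 1 - p_ever_launch

for p in (0.01, 0.25, 0.90):
    print(f"p={p:.2f}  strike: {p_survive(True, p):.3f}  wait: {p_survive(False, p):.3f}")
```

For any nonzero chance that they ever launch, striking first weakly dominates waiting, which is the game-theoretic core of “do it first”.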
I find the whole concept absolutely terrifying, particularly in light of the fact that exoplanets do, in fact, seem to be everywhere.
Of course, the real reason for the Fermi Paradox might be something else: Earth’s uniqueness (I have my doubts on this one), humanity’s local uniqueness (i.e. advanced civilizations might be rare enough that we are well outside the travel distances of other advanced species; much more likely), or, perhaps most likely, that radio communication is just an early part of the tech tree that advanced civilizations eventually stop using.
We have, alas, precisely one example of an advanced civilization to judge by: us. That’s a sample size that’s rather hard to reason about.