Thank you so much for the reply! Simply tracking down the ‘berserker hypothesis’ and the ‘great filter’ put me in touch with thinking on this subject that I was not aware of.
What I thought might be novel in what I wrote was the idea that the independent evolution of the same traits is evidence that life should progress to intelligence a great deal of the time.
When we look at the “great filter” possibilities, I am surprised that so many people think that our society’s self-destruction is such a likely candidate. Intuitively, if there are thousands of societies, one would expect high variability in social and political structures and outcomes. The next idea I read, that “no rational civilization would launch von Neumann probes,” seems extremely unlikely for the same reason. Where there would be far less variability is in the mundane constraints of energy and engineering involved in launching self-replicating spacecraft in a robust fashion. Problems there could easily stop every single one of our thousand candidate civilizations cold, with no variability.
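To put a rough number on that intuition, here is a back-of-the-envelope sketch (N = 1000 and the probabilities are illustrative assumptions, not estimates): for self-destruction alone to silence every civilization, each one would have to fail independently with a probability implausibly close to 1, whereas a shared engineering barrier stops everyone by definition.

```python
# Back-of-the-envelope sketch: if each of N civilizations independently
# self-destructs with probability p, the chance that ALL of them fail
# that way is p**N. N and the p values below are illustrative assumptions.
N = 1000
for p in (0.9, 0.99, 0.999):
    print(f"p = {p}: P(all {N} self-destruct) = {p**N:.3e}, "
          f"P(at least one survives) = {1 - p**N:.3f}")
```

Even at p = 0.999, the odds that at least one of a thousand civilizations slips through are about 63%, which is why a low-variability physical constraint looks like the stronger filter candidate.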
Yes, the current speculations in this field are of wildly varying quality. The argument about convergent evolution is sound.
A minor quibble about convergent evolution, which doesn’t change the conclusion much about there being other intelligent systems out there:
All organisms on Earth share some common points (though there might be shadow biospheres): similar environmental conditions (a rocky planet with a moon, a certain span of temperatures, etc.) and a certain biochemical basis (proteins, nucleic acids, water as a solvent, etc.). I’d distinguish convergent evolution within the same system of life on the one hand from convergent evolution across different systems of life on the other. We have only observed the former; the two likely overlap, but some traits may not be as universal as we’d be led to think.
For instance, eyes may be pretty useful here, but deep in the oceans of a world like Europa, provided life is possible there, they might not be (an instance of the environment conditioning what is likely to evolve).