Can someone recommend good Russian learning material? Preferably something that could be found online (books count).
Yes, the current speculations in this field are of wildly varying quality. The argument about convergent evolution is sound.
A minor quibble about convergent evolution, which doesn’t change the conclusion (that there are other intelligent systems out there) much.
All organisms on Earth share some common points (though there might be shadow biospheres): similar environmental conditions (a rocky planet with a moon, a certain range of temperatures, etc.) and a certain biochemical basis (proteins, nucleic acids, water as a solvent, etc.). I’d distinguish convergent evolution within the same system of life on the one hand from convergent evolution across different systems of life on the other. We have only observed the former; the two likely overlap, but some traits may not be as universal as we’d be led to think.
For instance, eyes may be pretty useful here, but deep in the oceans of a world like Europa, provided life is possible there, they might not be (an instance of the environment conditioning what is likely to evolve).
To the best of my knowledge, there is nothing quite like SIAI or LessWrong in continental western Europe. People aren’t as into AI as in the US, and where rationality thinking is being done, it’s mostly traditional rationality, skepticism, etc.
Atheism scores high in many countries; as a rule of thumb, countries to the north are more atheistic, while those to the south (Spain, Portugal, Italy, etc.) are more religious.
There are a few scattered transhumanist organizations, as well as a few life-extension organizations, which are loosely starting to cooperate.
The European Commission itself started prioritizing healthy life extension, on a small scale, a year or two ago. This could help focus more people on such questions in the years to come.
Likely few people read it; maybe just one voted, and that’s just one, potentially biased, opinion. The score isn’t significant.
I don’t see anything particularly wrong with your post. Its underlying ideas seem similar to the Fermi paradox and the berserker hypothesis, from which you derive that a great filter lies ahead of us, right?
Our bodies need to perform different roles as we age and mature, and we need different sets of skills depending on our current developmental phase. It would make sense for our brains to change too: for the brain’s developmental path to be planned so that it undergoes changes that make it better adapted to the tasks it will have to tackle in each phase.
It would make sense for our brain to be fine-tuned for grabbing resources from family when we’re kids, so we can grow as fast as possible; then better tuned to search for sexual partners once we mature; and lastly, tuned to take care of our kids once we have them.
And if there’s a mechanism which makes our brain undergo developmental changes along a pre-planned path, then we might also expect that past the age at which we reproduce, there’d be less and less evolutionary pressure to shape that developmental trajectory.
Nor do I think evolution would have much of a reason to cleanly engineer a stable end state after which development simply stops and leaves you with a well-adjusted, perfectly functional body or brain. That may not be a trivial task, after all.
Seems similar enough to “Every part of your brain assumes that all the other surrounding parts work a certain way. The present brain is the Environment of Evolutionary Adaptedness for every individual piece of the present brain.
Start modifying the pieces in ways that seem like “good ideas”—making the frontal cortex larger, for example—and you start operating outside the ancestral box of parameter ranges. And then everything goes to hell.
So you’ll forgive me if I am somewhat annoyed with people who run around saying, “I’d like to be a hundred times as smart!” as if it were as simple as scaling up a hundred times instead of requiring a whole new cognitive architecture.”
Eliezer Yudkowsky, Growing Up is Hard