CellBioGuy all your astrobiology posts are great I’d be happy to read all of those. This may be off the astrobiology topic but I would love to see a post with your opinion on the foom question. For example do you agree with Gwern’s post about there not being complexity limitations preventing runaway self-improving agents?
I generally have very low confidence in singularitarian ideas of any stripe, ‘foom’ or not. Partially for sociological reasons, having to do with the origins of singularitarian and related ideas. Partially for astrobiological reasons: nothing has ever consumed a star system or sent self-replicating anything between stars, and my impression of the range of possible outcomes for intelligent living things includes many that are neither extinction nor control of the universe, along with my sense of how frequently things like us might arise. Partially because I think that many people everywhere misattribute the causes of recent changes to the world, misjudge where those changes are going, and have short time horizons. Partially because I am pretty sure that diminishing returns apply to absolutely everything in this world aside from black hole growth.
I can’t say I’ve read Gwern’s analysis of computational complexity, but I do note that in the messy, complicated, poorly-sampled real world you can very seldom actually KNOW enough to predict many kinds of events with great precision.