Yeah analogies with evolutionary events are interesting. In the first example it’s natural selection doing the optimizing, which latches onto intelligence when that trait happens to be under selection pressure. This could certainly accelerate the growth of intelligence, but the big-brained parents are not actually using their brains to design their even-bigger-brained babies; that remains the purview of evolution no matter how big the brains get.
I agree the second example is closer to a FOOM: some scientific insights actually help us do better science. I’m thinking of the cognitive sciences in particular, rather than the more mundane case of building discoveries on discoveries: in the latter case the discoveries aren’t really feeding back into the optimization process; it’s still human reasoning playing that role, no matter how many discoveries you add.
The really interesting part of FOOM is when the intelligence being produced is the optimization process, and I think we really have no prior analogy for this.
If we view nature as our “programmer”, we could even be called self-recursive, since with each passing generation our knowledge as a species increases.
Kind of, but kind of not. I think self-recursing human intelligence would be parents modifying their babies to make them smarter.
Humans rapidly got smarter, but we were optimized by evolution. Computers got faster, but were optimized by humans.
When an optimization process improves itself, it makes itself even better at optimizing.
I think that’s a pretty decent definition of FOOM: “When an optimization process optimizes itself, and rapidly becomes more powerful than anything else seen before it.”
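The difference between being optimized from outside and optimizing yourself can be sketched with a toy simulation (hypothetical numbers, purely illustrative): when the optimizer’s power is fixed from outside, capability grows linearly; when the gains feed back into the optimizer itself, growth compounds.

```python
# Toy sketch (made-up numbers): an optimizer improved by a fixed external
# process vs. an optimizer whose improvements feed back into itself.

def externally_optimized(steps, gain=1.0):
    """Capability grows by a constant amount each step (e.g. evolution
    enlarging brains, or humans speeding up computers). The optimizer's
    own power never changes."""
    capability = 1.0
    for _ in range(steps):
        capability += gain
    return capability

def self_optimizing(steps, rate=0.1):
    """Each step's improvement scales with current capability, and the
    result feeds back into the thing doing the optimizing."""
    capability = 1.0
    for _ in range(steps):
        capability += rate * capability  # gain grows as capability grows
    return capability

print(externally_optimized(50))  # linear: 51.0
print(self_optimizing(50))       # compounding: ~117.4
```

Same number of steps in both cases; only the feedback loop differs, and that alone turns steady progress into the runaway curve the FOOM argument is about.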