I have heard discussion about the singularity on the web, but I have never had a clear idea of what it is, so I can’t say much about that.
You could contact Anna Salamon or Carl Shulman for a well-written introductory piece on the singularity.
Very short summary: if we humans manage to scientifically understand intelligence, then the consequences would be counter-intuitively extreme. The counter-intuitiveness comes from the fact that humans struggle to see our own intelligence in perspective:
both how extreme and sudden its effects have been on the biosphere,
and the fact that it is not the best possible form of intelligence, not the final word, but more like a messy first attempt.
If one accepts that intelligence is a naturalistic property of computational systems, then it becomes clear that the range of possible kinds or levels of intelligence probably extends both to much narrower and dumber systems than humans and to much more able, general systems.