Not sure if you’re referring to the same literature, but I note a great divergence between peak oil advocates and singularitarians. This is a little weird if you think of Aumann’s agreement theorem.
Both groups are heavily populated with engineer types, both are highly interested in cognitive biases, group dynamics, and the habits of individuals and societies, and neither is mainstream.
Both groups extrapolate curves from very real phenomena: for the Kurzweilian singularitarians it is computing power; for the peak oil advocates it is the Hubbert curve for resources, along with solid net-energy arguments about why civilization should be expected to decline.
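To make the contrast concrete, here is a minimal sketch of the two extrapolations, assuming a simple doubling-time exponential for the computing-power trend and the standard logistic-derivative form for the Hubbert curve. All parameter values below are illustrative, not fitted to real data.

```python
import numpy as np

# Illustrative parameters only -- not fitted to real data.
years = np.arange(1970, 2071)

# Moore's-law-style extrapolation: capability doubling every ~2 years.
doubling_time = 2.0
computing_power = 2.0 ** ((years - 1970) / doubling_time)

# Hubbert curve: production is the derivative of a logistic curve for
# cumulative extraction, peaking at t_peak and symmetric around it.
Q_total = 2e12   # ultimately recoverable resource (hypothetical units)
k = 0.06         # steepness of the logistic (assumed)
t_peak = 2010    # assumed peak year (illustrative)
production = (Q_total * k * np.exp(-k * (years - t_peak))
              / (1 + np.exp(-k * (years - t_peak))) ** 2)

# The exponential grows without bound; the Hubbert curve rises, peaks,
# and declines -- the two shapes behind the two groups' opposite
# expectations about the future.
```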
The extreme among the peak oil advocates are the collapsitarians, who believe that people must drastically change their lifestyles if they want to survive. They are not waiting for others to join them; many are preparing to move to small towns, villages, etc. The Oil Drum, linked here, started as a moderate peak oil site discussing all possibilities; nowadays, apparently, it’s all doom all the time.
The extreme among the singularitarians are asked for no such sacrifice, just to give enough money and support to make sure that Friendly AI is achieved first.
Both groups believe that business as usual cannot go on for much longer, but they expect dramatically different consequences. The singularitarians assert that economic conditions and technology will keep improving until a nonchalant superintelligence is created and wipes out humanity. The collapsitarians believe that economic conditions will worsen, that civilization is not robustly built, and that it will collapse badly, with humanity probably going extinct or only the last hunter-gatherers surviving.
It should be possible to believe both: unless you expect peak oil to lead to social collapse fairly soon, Moore’s law could make a singularity possible even while energy becomes more expensive.
That could suggest a distressing pinch point: not wanting to delay AI too long in case we run out of energy for it to use, and not wanting to build an AI too soon in case it’s Unfriendly.