Centre for the Study of Existential Risk (CSER) at Cambridge makes headlines.

As of an hour ago, I had not yet heard of the Centre for the Study of Existential Risk. Luke announced it to Less Wrong, as the University of Cambridge announced it to the world, back in April. CSER is scheduled to launch next year.

Here is a small selection of CSER press coverage from the last two days:

http://www.bbc.co.uk/news/technology-20501091
http://www.guardian.co.uk/education/shortcuts/2012/nov/26/cambridge-university-terminator-studies
http://www.dailymail.co.uk/news/article-2238152/Cambridge-University-open-Terminator-centre-study-threat-humans-artificial-intelligence.html
http://www.theregister.co.uk/2012/11/26/new_centre_human_extinction_risks/
http://www.slashgear.com/new-ai-think-tank-hopes-to-get-real-on-existential-risk-26258246/
http://www.techradar.com/news/world-of-tech/super-brains-to-guard-against-robot-apocalypse-1115293
http://www.hindustantimes.com/world-news/Europe/Cambridge-to-study-risks-from-robots-at-Terminator-Centre/Article1-964746.aspx
http://economictimes.indiatimes.com/news/news-by-industry/et-cetera/cambridge-to-study-risks-from-robots-at-terminator-centre/articleshow/17372042.cms
http://www.extremetech.com/extreme/141372-judgment-day-update-disneys-grenade-catching-robot-and-the-burger-flipping-robot-that-could-replace-2-million-us-workers
http://slashdot.org/topic/bi/cambridge-university-vs-skynet/
http://www.businessinsider.com/researchers-robots-risk-human-civilization-2012-11
http://www.newscientist.com/article/dn22534-megarisks-that-could-drive-us-to-extinction.html
http://news.cnet.com/8301-11386_3-57553993-76/killer-robots-cambridge-brains-to-assess-ai-risk/
http://www.globalpost.com/dispatches/globalpost-blogs/weird-wide-web/cambridge-university-opens-so-called-termintor-centre-stu
http://www.washingtonpost.com/world/europe/cambridge-university-to-open-center-studying-the-risks-of-technology-to-humans/2012/11/25/e551f4d0-3733-11e2-9258-ac7c78d5c680_story.html
http://www.foxnews.com/tech/2012/11/26/terminator-center-to-open-at-cambridge-university/

Google News: All 119 news sources...

Here’s an excerpt from one quite typical story appearing in tech-tabloid theregister.co.uk today:

Boffins at Cambridge University want to set up a new centre to determine what humankind will do when ultra-intelligent machines like the Terminator or HAL pose “extinction-level” risks to our species.

A philosopher, a scientist and a software engineer are proposing the creation of a Centre for the Study of Existential Risk (CSER) to analyse the ultimate risks to the future of mankind—including bio- and nanotech, extreme climate change, nuclear war and artificial intelligence.

Apart from the frequent portrayal of evil—or just misguidedly deadly—AI in science fiction, actual real scientists have also theorised that super-intelligent machines could be a danger to the human race.

Jaan Tallinn, the former software engineer who was one of the founders of Skype, has campaigned for serious discussion of the ethical and safety aspects of artificial general intelligence (AGI).

Tallinn has said that he sometimes feels he is more likely to die from an AI accident than from cancer or heart disease, CSER co-founder and philosopher Huw Price said. [...]

The source for these stories appears to be a press release from the University of Cambridge:

Humanity’s last invention and our uncertain future
http://www.cam.ac.uk/research/news/humanitys-last-invention-and-our-uncertain-future/

In 1965, Irving John ‘Jack’ Good sat down and wrote a paper for New Scientist called Speculations concerning the first ultra-intelligent machine. Good, a Cambridge-trained mathematician, Bletchley Park cryptographer, pioneering computer scientist and friend of Alan Turing, wrote that in the near future an ultra-intelligent machine would be built. [...]
Four quick observations:

1: That’s a lot of Terminator II photos.

2: FHI at Oxford and the Singularity Institute do not often get this kind of attention.
3: CSER doesn’t appear to have published anything yet.
4: The number of people who have heard the term “existential risk” must have doubled a few times today.