The survey gives a 33% chance of an “extremely bad outcome” from the development of machine intelligence.
Another of their surveys gave a 5% chance of human extinction at the hands of superintelligence by 2100.
These figures may need to be "adjusted" downward, on the grounds that the FHI appears to be something of a doom-mongering organisation, for whom the end of the world serves as a fund-raising and marketing tool, and that its surveys sample from its own friends and associates.