This post is by James Miller, who posted about a year ago that he was writing a book. It’s apparently out now, and it seems to have received endorsements from some recognizable figures. If anyone here has read it, how worthwhile a read would it be for someone already familiar with the idea of the singularity?
I’d recommend it. It’s not exactly Earth-shattering, but it made a number of interesting points I hadn’t encountered before, such as pointing out that the mere possibility of a Singularity could by itself be an existential risk if people took it seriously enough. For example, if a major nuclear power thought—correctly or incorrectly—that a hostile country was about to build an AI capable of undergoing a hard takeoff, it could use nuclear weapons against that country to prevent the AI’s completion, even at the risk of starting World War III in the process. I also thought the discussion of sexbots was interesting, among other things.
Hanson’s review.