...most plausible future scenarios. (?) I would take out “or most”.
Shortly thereafter, we may see an “intelligence explosion” or “technological Singularity” — a chain of events by which human-level AI leads, fairly rapidly, to intelligent systems whose capabilities far surpass those of biological humanity as a whole (Chalmers 2010). ... Finally, we discuss the possible consequences of an intelligence explosion and which actions we can take now to influence those results.
Is the idea of a “technological Singularity” anything other than a combination of predictions about technology and predictions about its social and political effects? An intelligence explosion could be followed by little change if, for example, all human-created AIs tended to become the equivalent of ascetic monks. That being so, I would start with the technological claims and make them the focus, rather than emphasizing the “Singularity” aspect, a Singularity being a situation after which the future is very different from what came before.