Nice overview. I agree, but I think the 2016-2021 plan could still arguably be described as “obtain god-like AI and use it to take over the world” (admittedly with some rhetorical exaggeration, but, like, not that much).
I think it’s pretty important that the 2016 to 2021 plan was explicitly aiming to avoid unleashing godlike power. “The minimal amount of power to do a thing which is otherwise impossible”, not “as much omnipotence as is allowed by physics”.
And similarly, the 2016 to 2021 plan did not entail optimizing the world except with regard to what is necessary to prevent dangerous AGIs.
These are both in contrast to the earlier 2004 to 2016 plan. So the rhetorical exaggeration confuses things.
MIRI actually did have a plan that, in my view, is well characterized as (eventually) taking over the world, without exaggeration, because it involved taking powerful, potentially aggressive action. That distinction is apt to get lost if we also describe the later, “toned down” plan as “taking over the world”.
This discussion is a nice illustration of why x-riskers are definitely more power-seeking than the average activist group. Just like Eskimos proverbially have 50 words for snow, AI-risk-reducers need at least 50 terms for “taking over the world” to demarcate the range of possible scenarios. ;)