The way I look at things, an AGI fooms straight from 1 to 2. At that point it has subdued all competing intelligences and can take its time getting to 3. I don’t think 2 can plausibly be boxed.
You don’t think the simplest AI capable of taking over the world can be boxed?
What if I build an AI and the only 2 things it is trained to do are:
pick stocks
design nuclear weapons
Is your belief that (a) this AI would not allow me to take over the world, or (b) this AI could not be boxed?
Designing nuclear weapons isn’t any use. The limiting factor in manufacturing nuclear weapons is uranium and industrial capacity, not technical know-how. That (I presume) is why Eliezer cares about nanobots. Self-replicating nanobots can plausibly create a greater power differential at a lower physical capital investment.
Do I think that the simplest AI capable of taking over the world (for practical purposes) can’t be boxed if it doesn’t want to be boxed? I’m not sure. I think that is a slightly different question from whether an AI fooms straight from 1 to 2. I think there are many different powerful AI designs. I predict some of them can be boxed. Also, I don’t know how good you are at taking over the world. Some people need to inherit an empire. Around 1200, one guy did it with like a single horse.
The 1940s would like to remind you that one does not need nanobots to refine uranium.
I’m pretty sure that if I had $1 trillion and a functional design for a nuclear ICBM, I could work out how to take over the world without any further help from the AI.
If you agree that:
it is possible to build a boxed AI that allows you to take over the world
taking over the world is a pivotal act
then maybe we should just do that instead of building a much more dangerous AI that designs nanobots and unboxes itself? (Assuming, of course, you accept Yudkowsky’s pivotal-act framework.)
I’m confused. Nobody has ever used nanobots to refine uranium.
Really? How would you do it? The Supreme Leader of North Korea has basically those resources and has utterly failed to conquer South Korea, much less the whole world. Israel and Iran are in similar situations and they’re mere regional powers.