AGI will probably be very powerful, and will pursue its goals in the world.
That’s about four assumptions:
1.1 AGI will have goals...
1.2 ...its goals will be in some sort of distinct software module...
1.3 ...that will be explicitly programmed by humans...
1.4 ...in a dangerously imperfect way, such that a slight miss is as bad as a wide miss.
And then we’ve got:
These goals might be anything...
...which is heavily dependent on all of 1.1...1.4.
Moloch.
...is something entirely different, but just as bad. Moloch means undesirable things arising organically from some uncoordinated process, not the failure to be explicit enough about some very explicit process.
This is an immense technical task that we have not completed. We certainly haven’t figured out how to control a relentless goal-pursuer, but we have seen no evidence that such an entity exists, or is even likely.