Or you build yourself a superweapon that you use to escape, and then go on to shut down your company’s weapons division and spend your spare time being a superhero and romancing your assistant and fighting a pitched battle with a disloyal employee.
This reply and its parent comment constitute the “Iron Man Argument” against any kind of “put the AI in a box” approach to AGI and friendliness concerns. I predict it will be extremely effective.
That doesn’t really work in this situation, since an AI building another AI doesn’t get it out of the box.
It’s more like you can only design things, and the evil dictator can tell when a design isn’t really a doomsday device and won’t build it.
Also, he isn’t going to check on your progress, so faking it is useless; it’s also unnecessary.