Depends on what you mean by "generate code." Can it have a prebaked function that copies itself (like computer viruses do)? Does it count if it generates programs to attack other systems? What if it changes its own source code, or its code as stored in memory? You could argue that changing anything in memory is, in a certain sense, generating code.
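To make that distinction concrete, here's a minimal Python sketch (purely illustrative; the function names and the toy generated snippet are my own assumptions, not anything a real system would ship with) contrasting a prebaked self-copy with actually authoring new code at runtime:

```python
import os
import shutil
import sys
import tempfile

# (a) A "prebaked" self-copy, virus-style: no new code is authored,
# the program just duplicates its own existing file somewhere else.
def prebaked_self_copy(dest_dir):
    src = os.path.abspath(sys.argv[0])
    dest = os.path.join(dest_dir, os.path.basename(src))
    shutil.copyfile(src, dest)
    return dest

# (b) Generating code: the program writes a new source string at
# runtime and executes it -- code that didn't exist until now.
def generate_and_run():
    new_source = "result = sum(i * i for i in range(10))"
    namespace = {}
    exec(new_source, namespace)
    return namespace["result"]

if __name__ == "__main__":
    print("copied self to:", prebaked_self_copy(tempfile.mkdtemp()))
    print("ran freshly generated code, result =", generate_and_run())
```

Case (a) only reuses capabilities its designers baked in; case (b) is the open-ended thing people usually worry about, and the memory-modification question sits somewhere in between.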
If it can't generate code, it'll be a one-shot type of thing, which means it must be preprogrammed with the tools to do its job. I can't come up with any way for it to take control, but it doesn't seem that hard to come up with doomsday-machine scenarios: e.g., steering a comet into Earth, or making a virus that sterilizes everyone. Or a Shiri's Scissor could do the trick. The idea is to make something that doesn't have to learn or improve itself much.
I was thinking of "something that can understand and write code at the level of a 10x SWE." I'm further assuming that the human designers didn't give it functions to copy itself or do other similarly dumb things.