I read a science fiction story where they built a self-sustaining space station, placed a colony of the scientists and engineers needed to run it aboard, and then sealed it off with no connection to the outside world. They modified all the computer files to make it appear as though humans had evolved on the station and that nothing existed beyond it. Then they woke up the AI and started stress-testing it by attacking it in non-harmful ways.
It was an interesting story, though I'm not sure how useful it would be in real life. The AI actually manages to figure out that there is something outside the station, and the humans are only saved because it develops its own moral code in which killing is wrong. That was a very convenient plot point, so I wouldn't trust it in real life.
See! You’ve found a problem to work on already! :)
[The downvote on your comment isn’t mine btw.]