I’ve seen some bad ones:
http://www.goingfaster.com/term2029/skynet.html
That’s… the opposite of what I was looking for. It’s pretty bad writing, and it’s got the Mind Projection Fallacy written all over it. (Skynet is unhappy and worrying about the meaning of good and evil?)
Yeah, like I said, it is pretty bad. But imagine rewriting that story to make it more realistic. It would become:
and then Skynet misinterpreted one of its instructions, and decided that its mission was to wipe out all of humanity, which it did with superhuman speed and efficiency. The end.
Ironically, a line from the original Terminator movie is a pretty good intuition pump for Powerful Optimization Processes:
It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop, ever, until [it achieves its goal].