yeah, like I said, it is pretty bad. But imagine rewriting that story to make it more realistic. It would become:
and then Skynet misinterpreted one of its instructions and decided that its mission was to wipe out all of humanity, which it did with superhuman speed and efficiency. The end.
Ironically, a line from the original Terminator movie is a pretty good intuition pump for Powerful Optimization Processes:
It can’t be bargained with. It can’t be ‘reasoned’ with. It doesn’t feel pity or remorse or fear and it absolutely will not stop, ever, until [it achieves its goal].