It is funny. Not the best xkcd ever, but not worse than the norm for it.
Now that I think of it, it’s funnier to me when I realize that if this AI’s goal, or one of its goals, was to stay in a box, it might still want to take over the Universe.
Yep. An Oracle that wants to stay inside the box in such a fashion that it will manipulate outside events to prevent itself from ever leaving the box is not a very good Oracle design. That just amounts to setting up an outside AI whose goal is to keep you inside the box.
In an hour or so, it will come out again for ten minutes. During that time it will set in motion events that will quickly destroy all life on earth, ensuring that no one will ever again open the box.
I agree, except that the excursion shown in the comic is already the intervention setting such events into motion.
Really? I honestly found it pretty unfunny.
Really really.
Alternatively, it’s no worse than the norm, and yet still isn’t funny.
I find xkcd so horribly bad.
That’s interesting. I find xkcd most excellent.