As much as I’m a regular xkcd reader, I’m mildly annoyed with this strip, because I imagine lots of people will be exposed to the idea of the AI-box experiment for the first time through it, and they’ll get that exposure bundled together with an unimportant, extremely speculative idea that they’re helpfully informed they’re meant to make fun of. Like, why even bring the basilisk up? What % of xkcd readers will even know what it is?
If the strip were also clever or funny, I’d see the point, but as it’s not, I don’t.
It is funny. Not the best xkcd ever, but not worse than the norm for it.
Now that I think of it, it’s funnier to me when I realize that if this AI’s goal, or one of its goals, was to stay in a box, it might still want to take over the Universe.
Yep. An Oracle that wants to stay inside the box in such a fashion that it will manipulate outside events to prevent itself from ever leaving is not a very good Oracle design. That just amounts to setting up an outside AI whose goal is to keep the Oracle inside the box.
In an hour or so, it will come out again for ten minutes. During that time it will set in motion events that will quickly destroy all life on earth, ensuring that no one will ever again open the box.
I agree, except that the excursion shown in the comic is already the intervention setting such events into motion.
Really? I honestly found it pretty unfunny.
Really really.
Alternatively, it’s no worse than the norm, and yet still isn’t funny.
I find xkcd so horribly bad.
That’s interesting. I find xkcd most excellent.
To be fair, I’d say that happens with many esoteric or unknown problems that are presented in the comic.
If you mean many esoteric or unknown problems get presented in a lighthearted way, sure.
If you mean they get presented together with / associated with a second, separate, and much less worthwhile problem, while the comic’s hidden text explicitly advises “this stuff is mockable”, not so sure.
Yeh, that’s why I stopped reading xkcd.