I guess I don’t get it.
The joke is of the “take some trend that is locally valid and just extend the trend line out and see where you land” flavor. For another example of a joke of this flavor, see https://xkcd.com/1007
The funny happens in the couple of seconds when the reader is holding “yep, that trend line does go to that absurd conclusion” and “that obviously will never happen” in their head at the same time, but has not yet figured out why the trend breaks. The expected level of amusement is “exhale slightly harder than usual through nose,” not “cackling laugh.”
Link is broken
Fixed, thanks
Thanks! A joke explained will never get a laugh, but I did somehow get a cackling laugh from your explanation of the joke.
I think I didn’t get it because I don’t think the trend line breaks. If you made a good enough noise reducer, it might well develop simulations smart and distinct enough that one could gain control of the simulator, and potentially from there the world. See “A smart enough LLM might be deadly simply if you run it for long enough” if you want to hurt your head on this.
I’ve thought about it a little because it’s interesting, but not a lot, because I think we’ll probably be killed by agents we made deliberately long before we’re killed by accidentally emergent ones.