How many Singularitarians does it take to change a light bulb?
Zero; they let their extrapolated volition decide whether to change it.
I’m sorry, but this is exactly what I mean by overly obvious failed jokes. “Singularitarian” associates to—return the first cached thought—“extrapolated volition”. There’s no gotcha, no surprise.
Suppose you ask:
Q: How many Eliezer Yudkowskys does it take to change a light bulb?
If you let your mind return the first cached answer, it will come out something like:
A: Two. One to change the light bulb, and one to say something about the Singularity or rationality.
Hahaha dull thud.
You’ve got to say something non-obvious like:
A: One, but he has to write another twenty Overcoming Bias posts before he gets there.
A: One, because the thought of two or more Eliezer Yudkowskys is too terrifying to even contemplate.
A: The problem of changing a single light bulb without turning the whole universe into light bulbs involves so many hidden difficulties that you essentially have to write a complete Friendly AI.
A: The thirty-seventh virtue of changing light bulbs is the little screech it makes when you screw it in.
A: Two, because if you just said “one”, it wouldn’t be funny.
Etc. Think past the first thought!
Q: How many Eliezer Yudkowskys does it take to change a light bulb?
A: His mind only needs to impose the ‘triangular’ concept on a light bulb, and then the light bulb changes by itself.
One, maybe two if he was planning to destroy the world when he started.
I know there’s some pun in here about the light-bulbs-as-ideas motif.
Q: How many Eliezer Yudkowskys does it take to change a light bulb?
A: When it comes on, will you sign up for cryonics?