Ooh, this is fun.
Robert Aumann has proven that ideal Bayesians cannot disagree with Eliezer Yudkowsky.
Eliezer Yudkowsky can make AIs Friendly by glaring at them.
Angering Eliezer Yudkowsky is a global existential risk.
Eliezer Yudkowsky thought he was wrong one time, but he was mistaken.
Eliezer Yudkowsky predicts Omega’s actions with 100% accuracy.
An AI programmed to maximize utility will tile the Universe with tiny copies of Eliezer Yudkowsky.
And the first action of any Friendly AI will be to create a nonprofit institute to develop a rigorous theory of Eliezer Yudkowsky. Unfortunately, it will turn out to be an intractable problem.
Transhuman AIs theorize that if they could create Eliezer Yudkowsky, it would lead to an “intelligence explosion”.
… because all of them are Eliezer Yudkowsky.
They call it “spontaneous symmetry breaking”, because Eliezer Yudkowsky just felt like breaking something one day.
Particles in parallel universes interfere with each other all the time, but nobody interferes with Eliezer Yudkowsky.
An oracle for the Halting Problem is Eliezer Yudkowsky’s cellphone number.
When tachyons get confused about their priors and posteriors, they ask Eliezer Yudkowsky for help.
Where’s the punch line?
Don’t make Eliezer Yudkowsky a global existential risk. You wouldn’t like him when he’s a global existential risk.
Eliezer can in fact tile the Universe with himself, simply by slicing himself into finitely many pieces. The only reason the rest of us are here is quantum immortality.
“An AI programmed to maximize utility will tile the Universe with tiny copies of Eliezer Yudkowsky.”
This one aged well:
https://www.smbc-comics.com/comic/ai-6