In an Iterated Prisoners’ Dilemma, Pinkie Pie beats you
General Artificial Intelligence is an aspiring human
In the new version of Newcomb’s problem, you have to choose between a box containing Eliezer Yudkowsky and a box containing the coherent extrapolated volition of Pinkie Pie
universal death is not decision theory
Less Wrong is not a cult so long as our meetups don’t include you. (OUCH!)
(deleted it by accident, but something like) “The 25th virtue of rationality is to avoid LessWrong at all costs” (pretty neat, since it comes right after a probable list size -- 24 -- and the source is LessWrong, reinforcing the idea that “you shouldn’t just do rationality here” a la “If you meet the Buddha on the road...”)
In an Iterated Prisoners’ Dilemma, Unfriendly an upload of Friendly Friendly a matrioshka brain beats Unfriendly any god (I think an Unfriendly AI is winning there …)