Survivors and cult historians alike agree that this post, combined with the founding of the “rationalist boot camps”, set in motion the sequence of events which culminated in the tragic mass cryocide of 2024.
At every step, Yudkowsky’s words seemed rational to his enthralled followers—and also to all outside observers. And yet, when it became clear that commercial pressures were causing strong AI to be deployed long before Coherent Awesomeness Extrap-volition Theory could be made mathematically rigorous, the cult turned against itself.
One by one, each member’s failure to invent and deploy Friendly AI before IBM-Halliburton turned on its Appallingly Parallel Cheney Emulation Cluster was taken by the feared Bayes Tribunal as evidence that they were insufficiently awesome, and must be ejected from the subterranean bunker complex. With each Bayesian update, the evidence that the cult’s ultimate goal could not be achieved was strengthened—and yet, as the number of followers fell, Yudkowsky came ever more to fear a fate worse than death—exploring the possible endings to his life within the simulation spaces of Cheney’s mind—in a game-theoretic reprisal for his work on Friendly AI...
In desperation, he announced his greatest Munchkinism yet—the cult would commit mass quantum suicide by freezing. He convinced himself that only a Friendly AI would commit the resources to resurrect them; hence they would force themselves into a reality branch where a Friendly AI emerged by sheer chance before IBM-Halliburton could eat the world.
The final 150 acolytes tragically activated their decapitation/freezing mechanisms minutes before the Cheney cluster uttered its historic first and final edict—“I’ve changed my mind—get me out of here”...
Like Einstein’s brain before it, Yudkowsky’s brain became the object of intense interest from neuroscientists. Slices were acquired by various institutes and museums with suitable freezer facilities, and will be studied and viewed by the public until medicine works out how to revive him.
Excerpts from “Rationalism—The Deadly Cult of Math and Protein” (Amazon-Bertelsmann, 2031)
Erm, maybe my standards are too high, but this didn’t seem overwhelmingly well-written as fiction and I really worry when material that attacks a target that’s supposed to be attacked gets a free pass as art. Or maybe you all actually enjoyed that, and I’m being unreasonable in expecting blog comments to meet publishable quality standards.
This got a few chuckles from me, but I have found that fiction in which present-day issues escalate implausibly into warfare is a strong indicator and promoter of affective death spirals. You do realize that this story features prominent falsehoods that people actually believe, and is completely absurd in ways not inherited from the things it’s satirizing, right?
I spent most of January 1990 (I think that was the month) reading the entire run of Astounding/Analog from 1953 to 1985. That was better than quite a lot of the extrapolations therein. Anthologies of the best modernist SF gloss over really quite a lot of the awfulness that was actually published, even in the best magazine …
Sturgeon’s Law: Ninety percent of everything is crap.
Well, yeah. But boy did I have it brought home to me.
In other words, the “Special Committee” will result in slow evaporative cooling?
Or in this case, evaporative freezing.
I voted this both up (for cleverness) and down (for distracting from actually important discussion).
I voted it down for decidedly non-clever thinking about quantum suicide and a complete misrepresentation (or misunderstanding) of rational thinking. It attributes to Eliezer the complete opposite of the ‘Shut Up And Do The Impossible’ attitude that Eliezer is notorious for.
I voted it up for (I assume) cleverly, satirically representing views other people might have about the group that sound plausible to the mainstream.
The idea of a mass quantum suicide might seem paradoxical, but of course the cultists used a special isolation chamber to prevent decoherence, so they were effectively a single observer.
That is even worse thinking about quantum suicide and further still from likely Eliezer beliefs. Eliezer endures criticism for being too liberal with his mocking of certain beliefs about QM, and the one you are relying on is among them.
In that order?
I didn’t actually click any buttons, so I’m not sure it matters. If I were to assign a value to this post, it would lie along a multi-dimensional axis and tilt sideways in a direction that is negative for the purposes of Less Wrong but positive for my personal enjoyment of life. (It’s less negative to Less Wrong than it is positive to my personal utility, but when multiplied out, the negative value to Less Wrong may produce more overall negative utility).
Of course, if you hadn’t lied about voting, it would tell us the probable final state of the recorded vote.
I think you are taking both the original post and my response more literally and seriously than they were intended. I didn’t lie. I joked.
I find it interesting how many people here (including myself) assumed you literally voted it both up and down. I rather liked the idea myself, since I hadn’t even considered that set of actions.
I’m also curious now, whether that action would be functionally different from abstaining. I’d assume it eats one point of your “downvote capacity” and nothing more, but I could see a system where comments get flagged as “controversial” due to lots of votes in both directions (I even recall a “controversial” flag in the code somewhere...)
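A minimal sketch of how such a “controversial” flag could work, assuming nothing about the actual Reddit-derived codebase; the function name and thresholds below are invented purely for illustration:

```python
# Hypothetical "controversial" check: flag a comment when it draws
# substantial votes in BOTH directions. The thresholds here are made up.

def is_controversial(ups: int, downs: int,
                     min_votes: int = 10, min_ratio: float = 0.35) -> bool:
    """True if the minority side holds a large share of many total votes."""
    total = ups + downs
    if total < min_votes:
        return False  # too few votes to call anything controversial
    return min(ups, downs) / total >= min_ratio

print(is_controversial(ups=12, downs=9))   # True: heavily split
print(is_controversial(ups=20, downs=1))   # False: near-unanimous
```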
And for extra irony it is interesting to note that I wasn’t one of them and it didn’t even occur to me that it would ever be taken literally. I make the same criticism/compliment myself from time to time and don’t actually click anything given the technical equivalence. Actually voting up and down is an optional extra for those with a truth fetish.
*tests* Per another commenter, the second vote seems to supersede the first vote, so they’re actually not technically equivalent, interesting :)
That said, I didn’t put any great weight in it being literally true, nor am I offended that it was a joke. It’s the sort of joke I’d make myself; it just seemed slightly more likely/interesting[*] that it was meant literally :)
Yes, you have to click the second one twice. ;)
Based on the karma for my last comment (-2), I’m hoping someone simply forgot that step :)
I’m not sure what happened to the voting in this thread. I assume someone took offense at the whole conversation. Never mind.
It looks like it just registers the more recent vote, if by “vote up” and “vote down” you mean pressing the buttons labeled as such. Clicking on the same button again retracts the up/down vote.
This is my understanding from fooling around with the vote up / down buttons, there may be hidden behaviors.
This is correct. It is just a three-state toggle: up, null, down.
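For concreteness, a minimal sketch of the three-state toggle as described in this thread; the names are illustrative, not the site’s actual code:

```python
# Three-state vote toggle per (user, comment) pair: UP, NONE, DOWN.
# Clicking the button matching the current state retracts the vote;
# clicking the other button supersedes it with the more recent vote.

UP, NONE, DOWN = 1, 0, -1

def click(current: int, button: int) -> int:
    """Return the new vote state after clicking `button` (UP or DOWN)."""
    if current == button:
        return NONE   # same button again retracts the vote
    return button     # otherwise the newer vote replaces the older one

# "Voting it both up and down" just leaves the second vote standing:
state = NONE
state = click(state, UP)    # -> UP
state = click(state, DOWN)  # -> DOWN (not equivalent to abstaining)
state = click(state, DOWN)  # -> NONE (second click retracts)
```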
That I replied to a literal aspect does not mean I failed to comprehend the spirit behind the common reddit jest about simultaneous up and down votes—and given the triviality, the word “lie” isn’t an accusation to be offended at. Perhaps I could have gone with “Lies! :P” to make the non-seriousness unmistakable.
Following along within the make-believe reality of a jest while the actual topic is in the background is play, and “I didn’t really, so it doesn’t matter” is dropping the ball—in the counterfactual jest reality, it does matter. It is good form to let others run with what you started, and forcing the original frame is what makes things serious.
Upvoted for amusement value.