At this point, there should be little doubt that the best response to this “basilisk” would have been “That’s stupid. Here are ten reasons why.”, rather than (paraphrasing for humor) “That’s getting erased from the internet. No, I haven’t heard the phrase ‘Streisand Effect’ before; why do you ask?”
The real irony is that Eliezer is now a fantastic example of the commitment/sunk cost effect which he has warned against repeatedly: having made an awful decision, and followed it up with further awful decisions over years (including at least 1 Discussion post deleted today and an expansion of topics banned on LW; incidentally, Eliezer, if you’re reading this, please stop marking ‘minor’ edits on the wiki which are obviously not minor), he is trapped into continuing his disastrous course of conduct and escalating his interventions or justifications.
And now the basilisk and the censorship are an established part of the LW or MIRI histories which no critic could possibly miss, and which pattern-matches on religion. (Stross claims that it indicates that we’re “Calvinist”, which is pretty hilarious for anyone who hasn’t drained the term of substantive meaning and turned it into a buzzword for people they don’t like.) A pity.
While we’re on the topic, I also blame Yvain to some extent; if he had taken my suggestion to add a basilisk question to the past LW survey, it would be much easier to go around to all the places discussing it and say something like ‘this is solely Eliezer’s problem; 98% disagree with censoring it’. But he didn’t, and so just as I predicted, we have lost a powerful method of damage control.
Let me consult my own crystal ball… Yes, the mists of time are parting. I see… I see… I see, a few years from now, a TED panel discussion on “Applied Theology”, chaired by Vernor Vinge, in which Eliezer, Roko, and Will Newsome discuss the pros and cons of life in an acausal multiverse of feuding superintelligences.
When you’re looking at consolidated diffs, it does. Double-checking, your last edit was marked minor, so I guess there was nothing you could’ve done there.
(It is good wiki editing practice to always make the minor or uncontroversial edits first, so that way your later edits can be looked at without the additional clutter of the minor edits or they can be reverted with minimal collateral damage, but that’s not especially relevant in this case.)
And now the basilisk and the censorship are an established part of the LW or MIRI histories which no critic could possibly miss, and which pattern-matches on religion.
That’s already true without the basilisk and censorship. The similarities between transhumanism and religion have been remarked on for about as long as transhumanism has been a thing.
I also blame Yvain to some extent; if he had taken my suggestion to add a basilisk question to the past LW survey, it would be much easier to go around to all the places discussing it and say something like ‘this is solely Eliezer’s problem; 98% disagree with censoring it’. But he didn’t.
Also, note that this wasn’t an unsolicited suggestion: in the post under which gwern’s comment appeared, Yvain actually said that he was “willing to include any question you want in the Super Extra Bonus Questions section [of the survey], as long as it is not offensive, super-long-and-involved, or really dumb.” And those are Yvain’s italics.
I also blame Yvain to some extent; if he had taken my suggestion to add a basilisk question to the past LW survey,
Then EY would have freaked the hell out, and I don’t know what the consequences of that would be but I don’t think they would be good. Also, I think the basilisk question would have had lots of mutual information with the troll toll question anyway: [pollid:419]
It’s too late. This poll is in the wrong place (attracting only those interested in it), will get too few responses (certainly not >1000), and is now obviously in reaction to much more major coverage than before so the responses are contaminated.
The Moving Finger writes; and, having writ, / Moves on: nor all thy Piety nor Wit / Shall lure it back to cancel half a Line, / Nor all thy Tears wash out a Word of it.
Actually, I was hoping to find some strong correlation between support for the troll toll and support for the basilisk censorship so that I could use the number of people who would have supported the censorship from the answers to the toll question in the survey. But it turns out that the fraction of censorship supporters is about 30% both among toll supporters and among toll opposers. (But the respondents to my poll are unlikely to be an unbiased sample of all LWers.)
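(For what it’s worth, the check described above is easy to reproduce. The counts below are made up purely for illustration, since only the ~30% fractions, not the raw poll numbers, are given here:)

```python
# Hypothetical poll counts (made up; the comment reports only the
# fractions): support for the basilisk censorship, broken down by
# answer to the troll-toll question.
toll_supporters = {"censor_yes": 3, "censor_no": 7}   # made-up counts
toll_opposers   = {"censor_yes": 6, "censor_no": 14}  # made-up counts

def censorship_fraction(group):
    """Fraction of the group that supports the censorship."""
    return group["censor_yes"] / (group["censor_yes"] + group["censor_no"])

# Both fractions come out to 0.30: with counts like these, knowing
# someone's troll-toll answer tells you essentially nothing about
# their censorship answer.
print(censorship_fraction(toll_supporters))  # 0.3
print(censorship_fraction(toll_opposers))    # 0.3
```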
Then EY would have freaked the hell out, and I don’t know what the consequences of that would be but I don’t think they would be good. Also, I think the basilisk question would have had lots of mutual information with the troll toll question anyway:
The ‘troll toll’ question misses most of the significant issue (as far as I’m concerned). I support the troll toll but have nothing but contempt for Eliezer’s behavior, comments, reasoning and signalling while implementing it. And in my judgement most of the mutual information with the censorship of Roko’s Basilisk (things like overconfidence, and various biases of the kind Gwern describes) has to do with judgements of competence based on that behavior, rather than with the technical change to the lesswrong software.
Just to be charitable to Eliezer, let me remind you of this quote. For example, can you conceive of a reason (not necessarily the officially stated one) that the actual basilisk discussion ought to be suppressed, even at the cost of the damage done to LW credibility (such as it is) by an offsite discussion of such suppression?
Stross claims that it indicates that we’re “Calvinist”
I thought this was more akin to Scientology, where any mention of Xenu to the uninitiated ought to be suppressed.
It sucks being Cassandra.
Sure does. Then again, it probably sucks more being Laocoön.
can you conceive of a reason (not necessarily the officially stated one) that the actual basilisk discussion ought to be
suppressed, even at the cost of the damage done to LW credibility (such as it is) by an offsite discussion of such
suppression?
The basilisk is harmless. Eliezer knows this. The Streisand effect was the intended consequence of the censorship. The hope is that people who become aware of the basilisk will increase their priors for the existence of real information hazards, and will in the future be less likely to read anything marked as such. It’s all a clever memetic inoculation program!
Another possibility: Eliezer doesn’t object to the meme that anyone who doesn’t donate to SIAI/MIRI will spend eternity in hell being spread in a deniable way.
We are the hollow men / we are the stuffed men / Leaning together / Headpiece filled with straw. Alas! / Our dried comments when / we discuss together / Are quiet and meaningless / As median-cited papers / or reports of supplements / on the Internet.
Another possibility: Eliezer does not want the meme to be associated with LW. Because, even if it was written by someone else, most people are predictably likely to read it and remember: “This is an idea I read on LW, so this must be what they believe.”
The hope is that people who become aware of the basilisk will increase their priors for the existence of real information hazards, and will in the future be less likely to read anything marked as such. It’s all a clever memetic inoculation program!
It’s certainly an inoculation for information hazards. Or at least against believing information hazard warnings.
Alternatively, the people dismissing the idea out of hand are not taking it seriously and thus not triggering the information hazard.
Also the censorship of the basilisk was by no means the most troubling part of the Roko incident, and as long as people focus on that they’re not focusing on the more disturbing issues.
Edit: The most troubling part was some comments, also deleted, indicating just how fanatically loyal some of Eliezer’s followers are.
Just to be charitable to Eliezer, let me remind you of this quote. For example, can you conceive of a reason (not necessarily the officially stated one) that the actual basilisk discussion ought to be suppressed, even at the cost of the damage done to LW credibility (such as it is) by an offsite discussion of such suppression?
No. I have watched Eliezer make this unforced error now for years, sliding into an obvious and common failure mode, with mounting evidence that censorship is, was, and will be a bad idea, and I have still not seen any remotely plausible explanation for why it’s worthwhile.
Just to take this most recent Stross post: he has similar traffic to me as far as I can tell, which means that since I get ~4000 unique visitors a day, he gets as many and often many more. A good chunk will be to his latest blog post, and it will go on being visited for years on end. If it hits the front page of Hacker News as more than a few of his blog posts do, it will quickly spike to 20k+ uniques in just a day or two. (In this case, it didn’t.) So we are talking, over the next year, easily 100,000 people being exposed to this presentation of the basilisk (just need average 274 uniques a day). 100k people being exposed to something which will strike them as patent nonsense, from a trusted source like Stross.
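(The per-day figure checks out; a quick sketch of the arithmetic, using only the numbers already given above:)

```python
# Back-of-envelope check: how many average daily uniques does the post
# need in order to reach 100,000 visitors over a year?
target_visitors = 100_000
days = 365
required_daily = target_visitors / days
print(round(required_daily))  # 274

# At the estimated baseline of ~4000 uniques/day, even a small fraction
# of that traffic landing on the post clears the required average.
baseline = 4000
print(baseline / required_daily)  # baseline is ~14.6x the requirement
```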
So maybe there used to be some sort of justification behind the sunk costs and obstinacy and courting of the Streisand effect. Does this justification also justify trashing LW/MIRI’s reputation among literally hundreds of thousands of people?
You may have a witty quote, which is swell, but I’m afraid it doesn’t help me see what justification there could be.
Sure does. Then again, it probably sucks more being Laocoön.
Laocoön died quickly and relatively cleanly by serpent; Cassandra saw all her predictions (not just one) come true, was raped, abducted, kept as a concubine, and then murdered.
I banned the last discussion post on the Basilisk, not Eliezer. I’ll let this one stand for now as you’ve put some effort into this post. However, I believe that these meta discussions are as annoyingly toxic as anything at all on Less Wrong. You are not doing yourself or anyone else any favors by continuing to ride this.
The reputational damage to Less Wrong has been done. Is there really anything to be gained by flipping moderation policy?
At this point, let’s not taunt people with the right kind of mental pathology to be made very uncomfortable by the basilisk or meta-set of basilisks.
The reputational damage to Less Wrong has been done. Is there really anything to be gained by flipping moderation policy?
There’s now the impression that a community of aspiring rationalists — or, at least, its de-facto leaders — are experiencing an ongoing lack of clue on the subject of the efficacy of censorship on online PR.
The “reputational damage” is not just “Eliezer or LW have this kooky idea.”
It is ”… and they think there is something to be gained by shutting down discussion of this kooky idea, when others’ experience (Streisand Effect, DeCSS, etc.) and their own (this very thread) are strong evidence to the contrary.”
It is the apparent failure to update — or to engage with widely-recognized reality at all — that is the larger reputational damage.
It is, for that matter, the apparent failure to realize that saying “Don’t talk about this because it is bad PR” is itself horrible PR.
The idea that LW or its leadership dedicate nontrivial attention to encircling and defending against this kooky idea makes it appear that the idea is central to LW. Some folks on the thread on Stross’s forum seem to think that Roko discovered the hidden secret motivating MIRI! That’s bogus … but there’s a whole trope of “cults” suppressing knowledge of their secret teachings; someone who’s pattern-matched LW or transhumanism onto “cult” will predictably jump right there.
At this point, let’s not taunt people with the right kind of mental pathology to be made very uncomfortable by the basilisk or meta-set of basilisks.
My own take on the whole subject is that basilisk-fear is a humongous case of privileging the hypothesis coupled to an anxiety loop. But … I’m rather prone to anxiety loops myself, albeit over matters a little more personal and less abstract. The reason not to poke people with Roko’s basilisk is that doing so is a form of aggression — it makes (some) people unhappy.
But as far as I can tell, it’s no worse in that regard than a typical Iain M. Banks novel, or some of Stross’s own ideas for that matter … which are considered entertainment. Which means … humans eat “basilisks” like this for dessert. In one of Banks’s novels, multiple galactic civilizations invent uploading, and use it to implement their religions’ visions of Hell, to punish the dead and provide an incentive to the living to conform to moral standards.
(But then, I read Stross and Banks. I don’t watch gore-filled horror movies, though, and I would consider someone forcing me to watch such a movie to be committing aggression against me. So I empathize with those who are actually distressed by the basilisk idea, or the “basilisk” idea for that matter.)
I have to say, I find myself feeling worse for Eliezer than for anyone else in this whole affair. Whatever else may be going on here, having one’s work cruelly mischaracterized and held up to ridicule is a whole bunch of no fun.
having one’s work cruelly mischaracterized and held up to ridicule is a whole bunch of no fun.
Thank you for appreciating this. I expected it before I got started on my life, I’m already accustomed to it by now, I’m sure it doesn’t compare to the pain of starving to death. Since I’m not in any real trouble, I don’t intend to angst about it.
Will an abandonment of a disastrous policy be more or less disastrous? Well, when I put it that way, it suddenly seems obvious.
“The world around us redounds with opportunities, explodes with opportunities, which nearly all folk ignore because it would require them to violate a habit of thought; there are a thousand Hufflepuff bones waiting to be sharpened into spears … I cannot quite comprehend what goes through people’s minds when they repeat the same failed strategy over and over, but apparently it is an astonishingly rare realization that you can try something else.”
Less disastrous as in “people spending less time criticizing Eliezer’s moderating skills”? Probably yes.
Less disastrous as in “people spending less time on LW discussing the ‘basilisk’”? Probably no. I would expect at least a dozen articles about this topic within the first year if the ban were completely removed.
Less disastrous as in “people less likely to create more ‘basilisk’-style comments”? Probably no. Seems that the policy prevented this successfully.
The reputational damage to Less Wrong has been done. Is there really anything to be gained by flipping moderation policy?
Answering the rhetorical question because the obvious answer is not what you imply [EDIT: I notice that J Taylor has made a far superior reply already]: Yes, it limits the ongoing reputational damage.
I’m not arguing with the moderation policy. But I will argue with bad arguments. Continue to implement the policy. You have the authority to do so, Eliezer has the power on this particular website to grant that authority, most people don’t care enough to argue against that behavior (I certainly don’t) and you can always delete the objections with only minimal consequences. But once you choose to make arguments that appeal to reason rather than the preferences of the person with legal power then you can be wrong.
At this point, let’s not taunt people with the right kind of mental pathology to be made very uncomfortable by the basilisk or meta-set of basilisks.
I’ve had people come to me who are traumatised by basilisk considerations. From what I can tell almost all of the trauma is attributable to Eliezer’s behavior. The descriptions of the experience give clear indications (i.e. direct self-reports that are coherent) that a significant reason they “take the basilisk seriously” is that Eliezer considers it a sufficiently big deal that he takes such drastic and emotional action. Heck, without Eliezer’s response it wouldn’t even have earned that title. It’d be a trivial backwater game theory question to which there are multiple practical answers.
So please, just go back to deleting basilisk talk. That would be way less harmful than trying to persuade people with reason.
I’ve had people come to me who are traumatised by basilisk considerations. From what I can tell almost all of the trauma is attributable to Eliezer’s behavior. The descriptions of the experience give clear indications (i.e. direct self-reports that are coherent) that a significant reason they “take the basilisk seriously” is that Eliezer considers it a sufficiently big deal that he takes such drastic and emotional action. Heck, without Eliezer’s response it wouldn’t even have earned that title. It’d be a trivial backwater game theory question to which there are multiple practical answers.
I get the people who’ve been frightened by it because EY seems to take it seriously too. (Dmytry also gets them, which is part of why he’s so perpetually pissed off at LW. He does his best to help, as a decent person would.) More generally, people distressed by it feel they can’t talk about it on LW, so they come to RW contributors—addressing this was why it was made a separate article. (I have no idea why Warren Ellis then Charlie Stross happened to latch onto it—I wish they hadn’t, because it was totally not ready, so I had to spend the past few days desperately fixing it up, and it’s still terrible.) EY not in fact thinking it’s feasible or important is a point I need to address in the last section of the RW article, to calm this concern.
It would be nice if you’d also address the extent to which it misrepresents other LessWrong contributors as thinking it is feasible or important (sometimes to the point of mocking them based on its own misrepresentation). People around LessWrong engage in hypothetical what-if discussions a lot; it doesn’t mean that they’re seriously concerned.
Lines like “Though it must be noted that LessWrong does not believe in or advocate the basilisk … just in almost all of the pieces that add up to it.” are also pretty terrible given we know only a fairly small percentage of “LessWrong” as a whole even consider unfriendly AI to be the biggest current existential risk. Really, this kind of misrepresentation of alleged, dubiously actually held extreme views as the perspective of the entire community is the bigger problem with both the LessWrong article and this one.
The article is still terrible, but it’s better than it was when Stross linked it. The greatest difficulty is describing the thing and the fuss accurately while explaining it to normal intelligent people without them pattern matching it to “serve the AI God or go to Hell”. This is proving the very hardest part. (Let’s assume for a moment 0% of them will sit down with 500K words of sequences.) I’m trying to leave it for a bit, having other things to do.
At this point, let’s not taunt people with the right kind of mental pathology to be made very uncomfortable by the basilisk or meta-set of basilisks.
As far as I can tell the entire POINT of LW is to talk about various mental pathologies and how to avoid them or understand them even if they make you very uncomfortable to deal with or acknowledge. The reasons behind talking about the basilisk or basilisks in general (apart from metashit about censorship) are just like the reasons for talking about trolley problems even if they make people angry or unhappy. What do you do when your moral intuitions seem to break down? What do you do about compartmentalization or the lack of it? Do you bite bullets? Maybe the mother should be allowed to buy acid.
To get back to meta shit: If people are complaining about the censorship and you are sick of the complaints, the simplest way to stop them is to stop the censorship. If someone tells you there’s a problem, the response of “Quit your bitching, it’s annoying” is rarely appropriate or even reasonable. Being annoying is the point of even lameass activism like this. I personally think any discussion of the actual basilisk has reached every conclusion it’s ever really going to reach by now, pretty reasonably demonstrated by looking at the uncensored thread, and the only thing even keeping it in anyone’s consciousness is the continued ballyhooing about memetic hazards.
I am appalled that you believe this response was remotely appropriate or superior to saying nothing at all. How is it not obvious that once you have publicly put on your hat as an authority you take a modicum of care to make sure you don’t behave like a contemptuous ass?
I wouldn’t call him that, and not because I have any doubt about his trustworthiness. It’s the other word, “source”, that I wouldn’t apply. He’s a professional SF author. His business is to entertain with ideas, and his blog is part of that. I wouldn’t go there in search of serious analysis of anything, any more than I would look for that on RationalWiki. Both the article in question and the comments on it are pretty much on a par with RationalWiki’s approach. In fact (ungrounded speculation alert), I have to wonder how many of the commenters there are RW regulars, there to fan the flame.
for example, can you conceive of a reason (not necessarily the officially stated one) that the actual basilisk discussion ought to be suppressed, even at the cost of the damage done to LW credibility (such as it is) by an offsite discussion of such suppression?
What if he CAN’T conceive of a reason? Can you conceive of the possibility that it might be for some reason other than Gwern being less intelligent than EY? For example, Gwern might be more intelligent than EY.
Discussion post deleted today and an expansion of topics banned on LW
Did you happen to catch the deleted post? Was there any interesting reasoning contained therein? If so, who was the author and did they keep a backup that they would be willing to email me? (If they did not keep a backup… that was overwhelmingly shortsighted unless they are completely unfamiliar with the social context!)
I saw it. It contained just a link and a line asking for “thoughts” or words to that effect. Maybe there was a quote—certainly nothing new or original.
I saw it. It contained just a link and a line asking for “thoughts” or words to that effect. Maybe there was a quote—certainly nothing new or original.
Thanks. I’ve been sent links to all the recently deleted content and can confirm that nothing groundbreaking was lost.
No, I haven’t heard the phrase ‘Streisand Effect’ before; why do you ask?
I’m not convinced the Streisand Effect is actually real. It seems like an instance of survival bias. After all, you shouldn’t expect to hear about the cases when information was successfully suppressed.
I’m not convinced the Streisand Effect is actually real.
This is a bizarre position to take. The effect does not constitute a claim that, all else being equal, attempts to suppress information are counterproductive. Instead it describes those cases where information is published more widely due to the suppression attempt. This clearly happens sometimes. The Wikipedia article gives plenty of unambiguous examples.
In April 2007, an attempt at blocking an Advanced Access Content System (AACS) key from being disseminated on Digg caused an uproar when cease-and-desist letters demanded the code be removed from several high-profile websites. This led to the key’s proliferation across other sites and chat rooms in various formats, with one commentator describing it as having become “the most famous number on the internet”. Within a month, the key had been reprinted on over 280,000 pages, printed on T-shirts and tattoos, and had appeared on YouTube in a song played over 45,000 times.
It would be absurd to believe that the number in question would have been made into T-shirts, tattoos and a popular YouTube song if no attempt had been made to suppress it. That doesn’t mean (or require) that powerful figures are never successful in suppressing information in other cases (particularly cases where the technological and social environment was completely different).
At this point, there should be little doubt that the best response to this “basilisk” would have been “That’s stupid. Here are ten reasons why.”, rather than (paraphrasing for humor) “That’s getting erased from the internet. No, I haven’t heard the phrase ‘Streisand Effect’ before; why do you ask?”
Heck, there is little doubt that even your paraphrased humorous alternative would have been much better than what actually happened. It’s not often that satirical caricatures are actually better than what they are based on!
At this point, there should be little doubt that the best response to this “basilisk” would have been “That’s stupid. Here are ten reasons why.
That would only be the best response if the basilisk were indeed stupid, and there were indeed ten good reasons why. Presumably you do think it is stupid, and you have a list of reasons why; but you are not in charge. (I hope it is obvious why saying it is stupid if you believed it was not, and writing ten bad arguments to that effect, would be monumentally stupid.)
But Eliezer’s reason for excluding such talk is precisely that (in his view, and he is in charge) it is not stupid, but a real hazard, the gravity of which goes way beyond the supposed effect on the reputation of LessWrong. I say “supposed” because as far as I can see, it’s the clowns at RationalWiki who are trying to play this up for all it’s worth. Reminds me of The Register yapping at the heels of Steve Jobs. The recent links from Stross and Marginal Revolution have been via RW. Did they just happen to take notice at the same time, or is RW evangelising this?
The current deletion policy calls such things “toxic mindwaste”, which seems fair enough to me (and a concept that would be worth a Sequence-type posting of its own). I don’t doubt that there are many other basilisks, but none of them have appeared on LW. Ce qu’on ne voit pas, indeed.
It’s not that hard. DG is using ‘the Rational Wiki community’ for ‘we’, ‘your’ refers to ‘the LessWrong community’, and ‘distressed children’ presumably refers to Dmytry, XiXi and by now, probably some others.
No, “distressed children” refers to people upset by the basilisk who feel they can’t talk about it on LW so they email us, presumably as the only people on the Internet bothering to talk about LW. This was somewhat surprising.
I presume it would include things that David Gerard could not repeat here. After all that’s why the folk in question contacted people from the Rational Wiki community in the first place!
Actually, I may have just answered my own question by reading the RW page on the b*s*l*sk that three prominent blogs and a discussion forum recently all linked to. Does reading that calm them down?
The “So you’re worrying about the Basilisk” bit is a distillation of stuff that’s helped people and is specifically for that purpose. (e.g., the “Commit not to accept acausal blackmail” section strikes me as too in-universe, but XiXiDu says that idea’s actually been helpful to people who’ve come to him.) It could probably do with more. The probability discussion in the section above arguably belongs in it, but it’s still way too long.
so they email us, presumably as the only people on the Internet bothering to talk about LW.
Or more likely, because RW has been the only place you could actually learn about it in the first place (for the last two years at least). So, I really don’t think you have any reason to complain about getting those emails.
Haha, what is this offline you speak of? You’re correct that I didn’t think of that. However wouldn’t they then already have someone to talk to about this, and not need to email people on the internet?
Either way, my point still stands. If you co-author an article on any topic X and let that article be linked to a way of contacting you (by either email or PM), then you cannot complain about people contacting you regarding topic X.
(answered at greater length elsewhere, but) This is isomorphic to saying “describing what is morally reprehensible about the God of the Old Testament causes severe distress to some theists, so atheists shouldn’t talk about it either”. Sunlight disinfects.
I’d discuss the moral reprehensibility of God (in both the new and the old testament) if and only if I saw the estimated benefit in attempting to deconvert those people as outweighing the disutility of their potential distress.
If you see such benefits in telling the people of the basilisk, and are weighing them against the disutility of the potential distress caused by such information, and the benefits indeed outweigh the hazard, then fine.
Your essential theory seems to be that if someone shines a light on a pothole, then it’s their fault if people fall into it, not that of whoever dug it.
The strategy of attempting to keep it a secret has failed in every way it could possibly fail. It may be time to say “oops” and do something different.
Your essential theory seems to be that if someone shines a light on a pothole, then it’s their fault if people fall into it, not that of whoever dug it.
Or, for that matter, the fault of whoever forbade the construction of safety rails around it.
To me it’s unclear whether you believe:
a) that it’s bad to try to keep the basilisk hidden because such an attempt was doomed to failure, or
b) that it’s bad to try to keep it hidden because it’s always bad to keep any believed-to-be infohazard hidden, regardless of whether you’ll succeed or fail,
or
c) that it’s bad to try to keep this basilisk hidden, because it’s not a real infohazard, but it would be good to keep real infohazards hidden, which actually harm the people you share them with.
Can you clarify to me which of (a), (b) or (c) you believe?
I didn’t claim my list was exhaustive. In particular, I was thinking of Dmytry and XiXiDu, both of whom are never far away from any discussion of LW and EY that takes place off-site. The better part of comments on the RW talk pages and Charles Stross’ blog concerning the basilisk are mostly copied and pasted from all their old remarks about the subject.
OK. What I heard in your earlier comment was that a wiki community was being held at fault for “opening their doors” to someone who criticized LW. Wikis are kind of known for opening their doors, and the skeptic community for being receptive to the literary genre of debunking.
That was a rather mind-killed comment. Wikis are supposed to have open doors. RW is supposed to deal with pseudoscience, craziness and the pitfalls of religions. The Bsl*sk is easily all three.
How is merely stating it to be “mind-killed” supposed to change my mind?
You might care about that sort of thing, you might not. I don’t exactly have a complete knowledge of your psychology.
You’re misinformed.
That’s irrelevant. Wikis open their doors to all contributors, and then eject those that don’t behave. That’s still an open door policy as opposed to invitation-only.
My comment wasn’t about whether or not RW should cover the Basilisk.
If it should cover the basilisk, why shouldn’t it have contributions from the “malcontents”?
If it should cover the basilisk, why shouldn’t it have contributions from the “malcontents”?
I didn’t make any such statement. Recall, DG was wondering where all this drama about the basilisk came from—I advised him that it came from two particular users, who are well-known for bringing up this drama in many other forums and have more-or-less dominated the RW talk pages on the subject.
At this point, there should be little doubt that the best response to this “basilisk” would have been “That’s stupid. Here are ten reasons why.”, rather than (paraphrasing for humor) “That’s getting erased from the internet. No, I haven’t heard the phrase ‘Streisand Effect’ before; why do you ask?”
The real irony is that Eliezer is now a fantastic example of the commitment/sunk cost effect which he has warned against repeatedly: having made an awful decision, and followed it up with further awful decisions over years (including at least 1 Discussion post deleted today and an expansion of topics banned on LW; incidentally, Eliezer, if you’re reading this, please stop marking ‘minor’ edits on the wiki which are obviously not minor), he is trapped into continuing his disastrous course of conduct and escalating his interventions or justifications.
And now the basilisk and the censorship are an established part of the LW or MIRI histories which no critic could possibly miss, and which pattern-matches on religion. (Stross claims that it indicates that we’re “Calvinist”, which is pretty hilarious for anyone who hasn’t drained the term of substantive meaning and turned it into a buzzword for people they don’t like.) A pity.
While we’re on the topic, I also blame Yvain to some extent; if he had taken my suggestion to add a basilisk question to the past LW survey, it would be much easier to go around to all the places discussing it and say something like ‘this is solely Eliezer’s problem; 98% disagree with censoring it’. But he didn’t, and so just as I predicted, we have lost a powerful method of damage control.
It sucks being Cassandra.
Let me consult my own crystal ball… Yes, the mists of time are parting. I see… I see… I see, a few years from now, a TED panel discussion on “Applied Theology”, chaired by Vernor Vinge, in which Eliezer, Roko, and Will Newsome discuss the pros and cons of life in an acausal multiverse of feuding superintelligences.
The spirits have spoken!
I’m looking forward to that.
Gwern, I made a major Wiki edit followed by a minor edit. I wasn’t aware that the latter would mask the former.
When you’re looking at consolidated diffs, it does. Double-checking, your last edit was marked minor, so I guess there was nothing you could’ve done there.
(It is good wiki editing practice to always make the minor or uncontroversial edits first, so that way your later edits can be looked at without the additional clutter of the minor edits or they can be reverted with minimal collateral damage, but that’s not especially relevant in this case.)
That’s already true without the basilisk and censorship. The similarities between transhumanism and religion have been remarked on for about as long as transhumanism has been a thing.
An additional item to pattern-match onto religion, perhaps I should have said.
Also, note that this wasn’t an unsolicited suggestion: in the post to which gwern’s comment was posted, Yvain actually said that he was “willing to include any question you want in the Super Extra Bonus Questions section [of the survey], as long as it is not offensive, super-long-and-involved, or really dumb.” And those are Yvain’s italics.
At this point it is this annoying, toxic meta discussion that is the problem.
Then EY would have freaked the hell out, and I don’t know what the consequences of that would be but I don’t think they would be good. Also, I think the basilisk question would have had lots of mutual information with the troll toll question anyway: [pollid:419]
EDIT: I guess I was wrong.
It’s too late. This poll is in the wrong place (attracting only those interested in it), will get too few responses (certainly not >1000), and is now obviously in reaction to much more major coverage than before so the responses are contaminated.
Actually, I was hoping to find some strong correlation between support for the troll toll and support for the basilisk censorship so that I could use the number of people who would have supported the censorship from the answers to the toll question in the survey. But it turns out that the fraction of censorship supporters is about 30% both among toll supporters and among toll opposers. (But the respondents to my poll are unlikely to be an unbiased sample of all LWers.)
The ‘troll toll’ question misses most of the significant issue (as far as I’m concerned). I support the troll toll but have nothing but contempt for Eliezer’s behavior, comments, reasoning and signalling while implementing the troll toll. And in my judgement most of the mutual information with the censorship of Roko’s Basilisk concerns those issues (things like overconfidence, and the various biases of the kind Gwern describes): it has to do with judgements of competence based on that behavior, rather than with the technical change to the lesswrong software.
Just to be charitable to Eliezer, let me remind you of this quote. For example, can you conceive of a reason (not necessarily the officially stated one) that the actual basilisk discussion ought to be suppressed, even at the cost of the damage done to LW credibility (such as it is) by an offsite discussion of such suppression?
I thought this is more akin to Scientology, where any mention of Xenu to the uninitiated ought to be suppressed.
Sure does. Then again, it probably sucks more being Laocoön.
The basilisk is harmless. Eliezer knows this. The Streisand effect was the intended consequence of the censorship. The hope is that people who become aware of the basilisk will increase their priors for the existence of real information hazards, and will in the future be less likely to read anything marked as such. It’s all a clever memetic inoculation program!
disclaimer : I don’t actually believe this.
Another possibility: Eliezer doesn’t object to the meme that anyone who doesn’t donate to SIAI/MIRI will spend eternity in hell being spread in a deniable way.
Why stop there? In fact, Roko was one of Eliezer’s many sock puppets. It’s your basic Ender’s Game stuff.
We are actually all Eliezer’s sock puppets. Most of us unfortunately are straw men.
We are the hollow men / we are the stuffed men / Leaning together / Headpiece filled with straw. Alas! / Our dried comments when / we discuss together / Are quiet and meaningless / As median-cited papers / or reports of supplements / on the Internet.
Another possibility: Eliezer does not want the meme to be associated with LW. Because, even if it was written by someone else, most people are predictably likely to read it and remember: “This is an idea I read on LW, so this must be what they believe.”
It’s certainly an inoculation for information hazards. Or at least against believing information hazard warnings.
Alternatively, the people dismissing the idea out of hand are not taking it seriously and thus not triggering the information hazard.
Also the censorship of the basilisk was by no means the most troubling part of the Roko incident, and as long as people focus on that they’re not focusing on the more disturbing issues.
Edit: The most troubling part was some comments, also deleted, indicating just how fanatically loyal some of Eliezer’s followers are.
Really? Or do you just want us to believe that you don’t believe this???
No. I have watched Eliezer make this unforced error now for years, sliding into an obvious and common failure mode, with mounting evidence that censorship is, was, and will be a bad idea, and I have still not seen any remotely plausible explanation for why it’s worthwhile.
Just to take this most recent Stross post: he has similar traffic to me as far as I can tell, which means that since I get ~4000 unique visitors a day, he gets as many and often many more. A good chunk will be to his latest blog post, and it will go on being visited for years on end. If it hits the front page of Hacker News as more than a few of his blog posts do, it will quickly spike to 20k+ uniques in just a day or two. (In this case, it didn’t.) So we are talking, over the next year, easily 100,000 people being exposed to this presentation of the basilisk (just need average 274 uniques a day). 100k people being exposed to something which will strike them as patent nonsense, from a trusted source like Stross.
So maybe there used to be some sort of justification behind the sunk costs and obstinacy and courting of the Streisand effect. Does this justification also justify trashing LW/MIRI’s reputation among literally hundreds of thousands of people?
You may have a witty quote, which is swell, but I’m afraid it doesn’t help me see what justification there could be.
Laocoön died quickly and relatively cleanly by serpent; Cassandra saw all her predictions (not just one) come true, was raped, abducted, kept as a concubine, and then murdered.
Can you please stop with this meta discussion?
I banned the last discussion post on the Basilisk, not Eliezer. I’ll let this one stand for now as you’ve put some effort into this post. However, I believe that these meta discussions are as annoyingly toxic as anything at all on Less Wrong. You are not doing yourself or anyone else any favors by continuing to ride this.
The reputational damage to Less Wrong has been done. Is there really anything to be gained by flipping moderation policy?
At this point, let’s not taunt people with the right kind of mental pathology to be made very uncomfortable by the basilisk or meta-set of basilisks.
There’s now the impression that a community of aspiring rationalists — or, at least, its de-facto leaders — are experiencing an ongoing lack of clue on the subject of the efficacy of censorship on online PR.
The “reputational damage” is not just “Eliezer or LW have this kooky idea.”
It is ”… and they think there is something to be gained by shutting down discussion of this kooky idea, when others’ experience (Streisand Effect, DeCSS, etc.) and their own (this very thread) are strong evidence to the contrary.”
It is the apparent failure to update — or to engage with widely-recognized reality at all — that is the larger reputational damage.
It is, for that matter, the apparent failure to realize that saying “Don’t talk about this because it is bad PR” is itself horrible PR.
The idea that LW or its leadership dedicate nontrivial attention to encircling and defending against this kooky idea makes it appear that the idea is central to LW. Some folks on the thread on Stross’s forum seem to think that Roko discovered the hidden secret motivating MIRI! That’s bogus … but there’s a whole trope of “cults” suppressing knowledge of their secret teachings; someone who’s pattern-matched LW or transhumanism onto “cult” will predictably jump right there.
My own take on the whole subject is that basilisk-fear is a humongous case of privileging the hypothesis coupled to an anxiety loop. But … I’m rather prone to anxiety loops myself, albeit over matters a little more personal and less abstract. The reason not to poke people with Roko’s basilisk is that doing so is a form of aggression — it makes (some) people unhappy.
But as far as I can tell, it’s no worse in that regard than a typical Iain M. Banks novel, or some of Stross’s own ideas for that matter … which are considered entertainment. Which means … humans eat “basilisks” like this for dessert. In one of Banks’s novels, multiple galactic civilizations invent uploading, and use it to implement their religions’ visions of Hell, to punish the dead and provide an incentive to the living to conform to moral standards.
(But then, I read Stross and Banks. I don’t watch gore-filled horror movies, though, and I would consider someone forcing me to watch such a movie to be committing aggression against me. So I empathize with those who are actually distressed by the basilisk idea, or the “basilisk” idea for that matter.)
I have to say, I find myself feeling worse for Eliezer than for anyone else in this whole affair. Whatever else may be going on here, having one’s work cruelly mischaracterized and held up to ridicule is a whole bunch of no fun.
Thank you for appreciating this. I expected it before I got started on my life, I’m already accustomed to it by now, I’m sure it doesn’t compare to the pain of starving to death. Since I’m not in any real trouble, I don’t intend to angst about it.
Glad to hear it.
The basilisk is now being linked on Marginal Revolution. Estimated site traffic: >3x gwern.net; per above, that is >16k uniques daily to the site.
What site will be next?
More importantly, will endless meta-discussions like this make another site more likely or less likely to link it?
Will an abandonment of a disastrous policy be more or less disastrous? Well, when I put it that way, it suddenly seems obvious.
Less disastrous as in “people spending less time criticizing Eliezer’s moderating skills”? Probably yes.
Less disastrous as in “people spending less time on LW discussing the ‘basilisk’”? Probably no. I would expect at least a dozen articles about this topic within the first year if the ban were completely removed.
Less disastrous as in “people less likely to create more ‘basilisk’-style comments”? Probably no. Seems that the policy prevented this successfully.
Answering the rhetorical question because the obvious answer is not what you imply [EDIT: I notice that J Taylor has made a far superior reply already]: Yes, it limits the ongoing reputational damage.
I’m not arguing with the moderation policy. But I will argue with bad arguments. Continue to implement the policy. You have the authority to do so, Eliezer has the power on this particular website to grant that authority, most people don’t care enough to argue against that behavior (I certainly don’t) and you can always delete the objections with only minimal consequences. But once you choose to make arguments that appeal to reason rather than the preferences of the person with legal power then you can be wrong.
I’ve had people come to me who are traumatised by basilisk considerations. From what I can tell almost all of the trauma is attributable to Eliezer’s behavior. The descriptions of the experience give clear indications (ie. direct self reports that are coherent) that a significant reason that they “take the basilisk seriously” is because Eliezer considers it a sufficiently big deal that he takes such drastic and emotional action. Heck, without Eliezer’s response it wouldn’t even have earned that title. It’d be a trivial backwater game theory question to which there are multiple practical answers.
So please, just go back to deleting basilisk talk. That would be way less harmful than trying to persuade people with reason.
I get the people who’ve been frightened by it because EY seems to take it seriously too. (Dmytry also gets them, which is part of why he’s so perpetually pissed off at LW. He does his best to help, as a decent person would.) More generally, people distressed by it feel they can’t talk about it on LW, so they come to RW contributors—addressing this was why it was made a separate article. (I have no idea why Warren Ellis then Charlie Stross happened to latch onto it—I wish they hadn’t, because it was totally not ready, so I had to spend the past few days desperately fixing it up, and it’s still terrible.) EY not in fact thinking it’s feasible or important is a point I need to address in the last section of the RW article, to calm this concern.
It would be nice if you’d also address the extent to which it misrepresents other LessWrong contributors as thinking it is feasible or important (sometimes to the point of mocking them based on its own misrepresentation). People around LessWrong engage in hypothetical what-if discussions a lot; it doesn’t mean that they’re seriously concerned.
Lines like “Though it must be noted that LessWrong does not believe in or advocate the basilisk … just in almost all of the pieces that add up to it.” are also pretty terrible given we know only a fairly small percentage of “LessWrong” as a whole even consider unfriendly AI to be the biggest current existential risk. Really, this kind of misrepresentation of alleged, dubiously actually held extreme views as the perspective of the entire community is the bigger problem with both the LessWrong article and this one.
The article is still terrible, but it’s better than it was when Stross linked it. The greatest difficulty is describing the thing and the fuss accurately while explaining it to normal intelligent people without them pattern matching it to “serve the AI God or go to Hell”. This is proving the very hardest part. (Let’s assume for a moment 0% of them will sit down with 500K words of sequences.) I’m trying to leave it for a bit, having other things to do.
As far as I can tell the entire POINT of LW is to talk about various mental pathologies and how to avoid them or understand them even if they make you very uncomfortable to deal with or acknowledge. The reasons behind talking about the basilisk or basilisks in general (apart from metashit about censorship) are just like the reasons for talking about trolley problems even if they make people angry or unhappy. What do you do when your moral intuitions seem to break down? What do you do about compartmentalization or the lack of it? Do you bite bullets? Maybe the mother should be allowed to buy acid.
To get back to meta shit: If people are complaining about the censorship and you are sick of the complaints, the simplest way to stop them is to stop the censorship. If someone tells you there’s a problem, the response of “Quit your bitching, it’s annoying” is rarely appropriate or even reasonable. Being annoying is the point of even lameass activism like this. I personally think any discussion of the actual basilisk has reached every conclusion it’s ever really going to reach by now, pretty reasonably demonstrated by looking at the uncensored thread, and the only thing even keeping it in anyone’s consciousness is the continued ballyhooing about memetic hazards.
yawn
I am appalled that you believe this response was remotely appropriate or superior to saying nothing at all. How is it not obvious that once you have publicly put on your hat as an authority you take a modicum of care to make sure you don’t behave like a contemptuous ass?
The meta discussions will continue until morale improves
I hate to use silly symmetrical rhetoric, however:
I wouldn’t call him that, and not because I have any doubt about his trustworthiness. It’s the other word, “source”, that I wouldn’t apply. He’s a professional SF author. His business is to entertain with ideas, and his blog is part of that. I wouldn’t go there in search of serious analysis of anything, any more than I would look for that on RationalWiki. Both the article in question and the comments on it are pretty much on a par with RationalWiki’s approach. In fact (ungrounded speculation alert), I have to wonder how many of the commenters there are RW regulars, there to fan the flame.
Stross is widely read, cited, and quoted approvingly, on his blog and off (eg. Hacker News). He is a trusted source for many geeks.
RationalWiki’s new coat-of-arms is a troll riding a basilisk.
What if he CAN’T conceive of a reason? Can you conceive of the possibility that it might be for some reason other than Gwern being less intelligent than EY? For example, Gwern might be more intelligent than EY.
Did you happen to catch the deleted post? Was there any interesting reasoning contained therein? If so, who was the author and did they keep a backup that they would be willing to email me? (If they did not keep a backup… that was overwhelmingly shortsighted unless they are completely unfamiliar with the social context!)
I saw it. It contained just a link and a line asking for “thoughts” or words to that effect. Maybe there was a quote—certainly nothing new or original.
Thanks. I’ve been sent links to all the recently deleted content and can confirm that nothing groundbreaking was lost.
I’m not convinced the Streisand Effect is actually real. It seems like an instance of survival bias. After all, you shouldn’t expect to hear about the cases when information was successfully suppressed.
This is a bizarre position to take. The effect does not constitute a claim that, all else being equal, attempts to suppress information are counterproductive. Instead it describes those cases where information is published more widely due to the suppression attempt. This clearly happens sometimes. The Wikipedia article gives plenty of unambiguous examples.
It would be absurd to believe that the number in question would have been made into T-shirts, tattoos and a popular YouTube song if no attempt had been made to suppress it. That doesn’t mean (or require) that in other cases (particularly other cases where the technological and social environment was completely different) powerful figures cannot sometimes be successful in suppressing information.
Heck, there is little doubt that even your paraphrased humorous alternative would have been much better than what actually happened. It’s not often that satirical caricatures are actually better than what they are based on!
That would only be the best response if the basilisk were indeed stupid, and there were indeed ten good reasons why. Presumably you do think it is stupid, and you have a list of reasons why; but you are not in charge. (I hope it is obvious why saying it is stupid if you believed it was not, and writing ten bad arguments to that effect, would be monumentally stupid.)
But Eliezer’s reason for excluding such talk is precisely that (in his view, and he is in charge) it is not stupid, but a real hazard, the gravity of which goes way beyond the supposed effect on the reputation of LessWrong. I say “supposed” because as far as I can see, it’s the clowns at RationalWiki who are trying to play this up for all it’s worth. Reminds me of The Register yapping at the heels of Steve Jobs. The recent links from Stross and Marginal Revolution have been via RW. Did they just happen to take notice at the same time, or is RW evangelising this?
The current deletion policy calls such things “toxic mindwaste”, which seems fair enough to me (and a concept that would be worth a Sequence-type posting of its own). I don’t doubt that there are many other basilisks, but none of them have appeared on LW. Ce qu’on ne voit pas (“that which is not seen”), indeed.
RW didn’t push this at all. I have no idea why Warren Ellis latched onto it, though I expect that’s where Charlie Stross picked it up from.
The reason the RW article exists is because we’re getting the emails from your distressed children.
I can’t parse this. Who are “we”, “you”, and the “distressed children”? I don’t think I have any, even metaphorically.
It’s not that hard. DG is using ‘the Rational Wiki community’ for ‘we’, ‘your’ refers to ‘the LessWrong community’, and ‘distressed children’ presumably refers to Dmytry, XiXi and by now, probably some others.
No, “distressed children” refers to people upset by the basilisk who feel they can’t talk about it on LW so they email us, presumably as the only people on the Internet bothering to talk about LW. This was somewhat surprising.
Well then, that’s the reputation problem solved. If it’s only RationalWiki...
What do you tell them?
I presume it would include things that David Gerard could not repeat here. After all that’s why the folk in question contacted people from the Rational Wiki community in the first place!
Actually, I may have just answered my own question by reading the RW page on the b*s*l*sk that three prominent blogs and a discussion forum recently all linked to. Does reading that calm them down?
The “So you’re worrying about the Basilisk” bit is a distillation of stuff that’s helped people and is specifically for that purpose. (e.g., the “Commit not to accept acausal blackmail” section strikes me as too in-universe, but XiXiDu says that idea’s actually been helpful to people who’ve come to him.) It could probably do with more. The probability discussion in the section above arguably belongs in it, but it’s still way too long.
Or more likely, because RW has been the only place you could actually learn about it in the first place (for the last two years at least). So, I really don’t think you have any reason to complain about getting those emails.
That’s not strictly true; for instance, it may be discussed offline!
Haha, what is this offline you speak of? You’re correct that I didn’t think of that. However wouldn’t they then already have someone to talk to about this, and not need to email people on the internet?
Either way, my point still stands. If you co-author an article on any topic X and let that article be linked to a way of contacting you (by either email or PM), then you cannot complain about people contacting you regarding topic X.
Isn’t it on RW that these people read the basilisk in the first place?
(answered at greater length elsewhere, but) This is isomorphic to saying “describing what is morally reprehensible about the God of the Old Testament causes severe distress to some theists, so atheists shouldn’t talk about it either”. Sunlight disinfects.
I’d discuss the moral reprehensibility of God (in both the new and the old testament) if and only if I saw the estimated benefit in attempting to deconvert those people as outweighing the disutility of their potential distress.
If you see such benefits in telling the people of the basilisk, and are weighing them against the disutility of the potential distress caused by such information, and the benefits indeed outweigh the hazard, then fine.
Your essential theory seems to be that if someone shines a light on a pothole, then it’s their fault if people fall into it, not that of whoever dug it.
The strategy of attempting to keep it a secret has failed in every way it could possibly fail. It may be time to say “oops” and do something different.
Or, for that matter, the fault of whoever forbade the construction of safety rails around it.
To me it’s unclear whether you believe: a) that it’s bad to try to keep the basilisk hidden because such an attempt was doomed to failure,
or b) that it’s bad to try to keep it hidden because it’s always bad to keep any believed-to-be infohazard hidden, regardless of whether you’ll succeed or fail, or c) that it’s bad to try to keep this basilisk hidden, because it’s not a real infohazard, but it would be good to keep real infohazards hidden, which actually harm the people you share them with.
Can you clarify to me which of (a), (b) or (c) you believe?
Yes, RW was just the forum that willingly opened their doors to various anti-LW malcontents, who are themselves pushing this for all it’s worth.
That’s overly specific. Mostly they’re folks who like to snicker at weird ideas — most of which I snicker at, too.
I didn’t claim my list was exhaustive. In particular, I was thinking of Dmytry and XiXiDu, both of whom are never far away from any discussion of LW and EY that takes place off-site. The better part of comments on the RW talk pages and Charles Stross’ blog concerning the basilisk are mostly copied and pasted from all their old remarks about the subject.
OK. What I heard in your earlier comment was that a wiki community was being held at fault for “opening their doors” to someone who criticized LW. Wikis are kind of known for opening their doors, and the skeptic community for being receptive to the literary genre of debunking.
That was a rather mind-killed comment. Wikis are supposed to have open doors. RW is supposed to deal with pseudoscience, craziness and the pitfalls of religions. The Bsl*sk is easily all three.
In what way? How is merely stating it to be “mind-killed” supposed to change my mind?
You’re misinformed.
My comment wasn’t about whether or not RW should cover the Basilisk.
You might care about that sort of thing, you might not. I don’t exactly have a complete knowledge of your psychology.
That’s irrelevant. Wikis open their doors to all contributors, and then eject those that don’t behave. That’s still an open-door policy as opposed to invitation-only.
If it should cover the basilisk, why shouldn’t it have contributions from the “malcontents”?
I didn’t make any such statement. Recall, DG was wondering where all this drama about the basilisk came from—I advised him that it came from two particular users, who are well-known for bringing up this drama in many other forums and have more-or-less dominated the RW talk pages on the subject.