Roko’s Basilisk legitimately demonstrates a problem with LW. “Rationality” that leads people to believe such absurd ideas is messed up, and 1) the presence of a significant number of people psychologically affected by the basilisk and 2) the fact that Eliezer accepts that basilisk-like ideas can be dangerous are signs that there is something wrong with the rationality practiced here.
“Rationality” leads people to believe such absurd ideas
Are you sure you have pinpointed the right culprit? Why exactly “rationality”? “Zooming in” and “zooming out” would lead to potentially different conclusions. E.g. G.K.Chesterton would probably blame atheism[1]. Zooming out even more, for example, someone immersed in Eastern thought might even blame Western thought in general.
Despite receiving a vastly disproportionate share of media attention, it was such a small part of LessWrong history and thought (by the way, does everything that any LWer ever came up with count as part of LW thought?) that it seems wrong to put the blame on LessWrong or rationality in general.
Furthermore, which would you say is better: an ability to formulate an absurd idea and then find its flaws (or, for e.g. mathematical ideas, exactly under what strange conditions they hold), or an inability to formulate absurd ideas at all? The ability to come up with various absurd ideas is an unavoidable side effect of having an imagination. What is important is not to start believing such an idea immediately, because at the very beginning of the history of any really new and outlandish idea there is an important asymmetry (which arises because coming up with any complicated idea takes time): the idea itself has already been invented, but the good counterarguments do not yet exist (this is similar to the situation where a new species is introduced to an island where it has no natural predators, which are introduced only later). The same applies to the moment when a new outlandish idea is introduced to your mind and you have not yet heard any counterarguments; one must nevertheless exercise caution, especially if the new idea is elegant and thought-provoking whereas all the counterarguments are comparatively ugly and complicated, and thus might feel unsatisfactory even after you have heard them.
the presence of a significant number of people psychologically affected
Was there really a significant number of people, or is this just, well, an urban legend? The fact that some people are affected is not particularly surprising; it seems to be consistent with the existence of e.g. OCD. Again, one must remember that not everyone thinks the same way, and the common factor among the people affected might have been something other than acquaintance with LW and rationality, which is what you seem to imply (correct me if my impression is wrong).
the fact that Eliezer accepts that basilisk-like ideas can be dangerous
I think it is better to give Eliezer a chance to explain why he did what he did. My understanding is that whenever someone introduces a person to a new variant of this concept without explaining the proper counterarguments, it takes time for that person to acquaint themselves with them. In very specific instances that might lead to unnecessary worrying about it and potentially even to some actions (most people would regard this idea as too outlandish and too weird whether or not it was correct, and would compartmentalize everything even if it was). A clever devil’s advocate could potentially come up with more and more elaborate versions of this idea which take more and more time to take down. As you can see, it is not necessary for any form of this idea to be correct for this gap to expand.
Personally I understand (and share) the appeal of various interesting speculative ideas, and the frustration when someone claims that such ideas are supposedly bad for some people, which goes against my instincts and the highly valuable norm of a free marketplace of ideas.
At this point in time, however, the basilisk seems to be brought up more often in order to dismiss all of LW rather than only this specific idea, so it is no wonder that many people get defensive even if they do not believe it.
All of this does not touch the question of whether the whole situation was handled the way it should have been handled.
[1] Although the source says that the famous quote is misattributed. Huh. I remember reading a similar idea in one of the “Father Brown” short stories. I’ll have to check it.
(excuse my English, feel free to correct mistakes)
Are you sure you have pinpointed the right culprit? Why exactly “rationality”? “Zooming in” and “zooming out” would lead to potentially different conclusions.
The quotes indicate that I’m not blaming rationality; I’m blaming something that’s called rationality. You’re replying as if I’m blaming real rationality, which I’m not.
Was there really a significant number of people or is this just, well, an urban legend?
Censoring substantial references to the basilisk was partly done in the name of protecting the people affected. This requires that there be a significant number of people, not just that there be the normal number of people who can be affected by any unusual idea.
I think it is better to give Eliezer a chance to explain why he did what he did.
His explanations have varied. The explanation you linked to is fairly innocuous; it implies that he is only banning discussion because people get harmed when thinking about it. Someone else linked a screengrab of Eliezer’s original comment which implies that he banned it because it can make it easier for superintelligences to acausally blackmail us, which is very different from the one you linked.
Censoring substantial references to the basilisk was partly done in the name of protecting the people affected. This requires that there be a significant number of people, not just that there be the normal number of people who can be affected by any unusual idea.
Curiously, it is not necessary. For example, it would suffice that the people who do the censoring overestimate the number of people that might need protection. Or consider the PR explanation that I gave in another comment, which similarly does not require a large number of people affected. Some other parts of your comment are also addressed there.
It is certainly possible that few people were affected by the Basilisk, and the people who do the censoring either overestimate the number or are just using it as an excuse. But this reflects badly on LW all by itself, and also amounts to “you cannot trust the people who do the censoring”, a position which is at least as unpopular as my initial one.
Are you sure you have pinpointed the right culprit? Why exactly “rationality”? “Zooming in” and “zooming out” would lead to potentially different conclusions. E.g. G.K.Chesterton would probably blame atheism[1]. Zooming out even more, for example, someone immersed in Eastern thought might even blame Western thought in general.
It’s whatever makes LW different from the wider population, even the wider nerdy-western-liberal-college-educated cluster. The general population of atheists does not have problems with basilisks, and laughs them off when you describe them to them.
Despite receiving a vastly disproportionate share of media attention, it was such a small part of LessWrong history and thought (by the way, does everything that any LWer ever came up with count as part of LW thought?) that it seems wrong to put the blame on LessWrong or rationality in general.
It also received a disproportionate amount of ex cathedra moderator action. Which things are so important to EY that he feels it necessary to intervene directly and in a massively controversial way? By their actions we can conclude that the Basilisk is much more important to the LW leadership than e.g. the illegitimate downvoting that drove danerys away.
Furthermore, which would you say is better: an ability to formulate an absurd idea and then find its flaws (or, for e.g. mathematical ideas, exactly under what strange conditions they hold), or an inability to formulate absurd ideas at all? The ability to come up with various absurd ideas is an unavoidable side effect of having an imagination. What is important is not to start believing such an idea immediately, because at the very beginning of the history of any really new and outlandish idea there is an important asymmetry (which arises because coming up with any complicated idea takes time): the idea itself has already been invented, but the good counterarguments do not yet exist (this is similar to the situation where a new species is introduced to an island where it has no natural predators, which are introduced only later). The same applies to the moment when a new outlandish idea is introduced to your mind and you have not yet heard any counterarguments; one must nevertheless exercise caution, especially if the new idea is elegant and thought-provoking whereas all the counterarguments are comparatively ugly and complicated, and thus might feel unsatisfactory even after you have heard them.
I don’t think this addresses the original argument. If these ideas are dangerous to us then we are doing something wrong. If you’re saying that danger is an unavoidable cost of being able to generate interesting ideas, then the large number of other groups who seem to come up with interesting ideas without ideas that present a danger to them seems like a counterexample.
Was there really a significant number of people, or is this just, well, an urban legend? The fact that some people are affected is not particularly surprising; it seems to be consistent with the existence of e.g. OCD. Again, one must remember that not everyone thinks the same way, and the common factor among the people affected might have been something other than acquaintance with LW and rationality, which is what you seem to imply (correct me if my impression is wrong).
I don’t know, but the LW leadership’s statements seem to be grounded in the claim that there were.
Which things are so important to EY that he feels it necessary to intervene directly and in a massively controversial way? By their actions we can conclude that the Basilisk is much more important to the LW leadership than e.g. the illegitimate downvoting that drove danerys away.
At the time the Basilisk episode happened, Eliezer was a lot more active in general than when the illegitimate downvoting happened.
If you’re saying that danger is an unavoidable cost of being able to generate interesting ideas, then the large number of other groups who seem to come up with interesting ideas without ideas that present a danger to them seems like a counterexample.
If you look at the self-professed skeptic community, there are episodes such as Elevatorgate.
If you go a bit further back and look at what Stalin did, I would call the ideas on which he acted dangerous.
The general population of atheists does not have problems with basilisks, and laughs them off when you describe them to them.
It’s pretty easy to speak about a lot of topics in a way that makes the people you are talking to laugh and not take the idea seriously.
A bunch of that atheist population also treats its New Atheism like a religion and closes itself off from alternative ideas that sound weird. For practical purposes they are religious and do have a fence against taking new ideas seriously.
It’s whatever makes LW different from the wider population, even the wider nerdy-western-liberal-college-educated cluster. The general population of atheists does not have problems with basilisks, and laughs them off when you describe them to them.
What ideas does the general population of atheists have in common besides the lack of belief in God? And what interesting ideas can you derive from that? F. Dostoevsky (who wasn’t even an atheist) seems to have thought that from this one could derive that everything is morally permitted. Maybe some atheistic ideas seemed new, interesting and outlandish in the past when there were few atheists (e.g. the separation of church and state), but nowadays they are part of common sense.
No, the claim of this hypothetical Chesterton would not be that atheism creates new weird ideas. It would be that by rejecting God you lose the defense against various weird ideas (“It’s the first effect of not believing in God that you lose your common sense.”—G. K. Chesterton). It is not atheism in general, it is specific atheist groups. And in the history of the world, there were a lot of atheists who believed in strange things. E.g. some atheists believe in reincarnation or spiritism. Some believe that the Earth is a zoo kept by aliens. In earlier times, some revolutionaries (led not by their atheism, but by other ideologies) believed that just because the social order is not God-given, it could easily be changed into basically anything. The hypothetical Chesterton would probably claim that had all these people closely followed the church’s teachings, they would not have believed in these follies, since the common sense provided by traditional Christianity would have prevented them. And he would probably be right. The hypothetical Chesterton would probably think that the basilisk is yet another item on the long list of things some atheists stupidly believe.
Yes, on LessWrong the weirdness heuristic is used less than in the more general atheist/skeptic community (in my previous post I have already mentioned why I think it is often useful), and it is considered bad to dismiss an idea if the only counterargument to it is that it is weird. The difference in acceptance of the weirdness heuristic probably comes from different mentalities: trying to become more rational vs. a more conservative strategy of trying to avoid being wrong (e.g. accepting epistemic learned helplessness when faced with weird and complicated arguments and defaulting to the mainstream position). This difference may reduce a person’s defenses against various new and strange ideas. But even then, one of the most upvoted LW posts of all time already talks about this danger.
Nevertheless, while you claim that the general population of atheists “laughs them off when you describe them to them”, it is my impression that the same is true here on LessWrong, as the absolute majority of LWers do not consider it a serious thing (sadly, I do not recall any survey asking about that). It is just a small proportion of LWers that believe in this idea. Thus it cannot be “whatever makes LW different from the wider population”; it must be whatever makes that small group different from the wider LW population, because even after rejecting adherence to tradition (which would be the hypothetical Chesterton’s explanation) and diminishing the use of the weirdness heuristic (which would be the average skeptic’s explanation), the majority of LWers still do not believe it. And the reasons why some LWers become defensive when someone brings it up are probably very similar to those described in the blog post “Weak Men Are Superweapons” by Yvain.
One could argue that LessWrong thought made it possible to formulate such an idea, which I have already addressed in my previous post. Once you have a wide vocabulary of ideas you can come up with many things. It is important to be able to find out whether the thing you came up with is true.
If these ideas are dangerous to us then we are doing something wrong. If you’re saying that danger is an unavoidable cost of being able to generate interesting ideas, then the large number of other groups who seem to come up with interesting ideas without ideas that present a danger to them seems like a counterexample.
I do not think that thinking about the basilisk is dangerous to us. Maybe it is to some people with OCD or something similar; I do not know. I talked about absurdity, not danger. It seems to me that instead of restricting our imagination (so as to avoid coming up with absurd things), we should let it run free and try to improve our ability to recognize which of these imagined ideas are actually true.
It also received a disproportionate amount of ex cathedra moderator action. Which things are so important to EY that he feels it necessary to intervene directly and in a massively controversial way? By their actions we can conclude that the Basilisk is much more important to the LW leadership than e.g. the illegitimate downvoting that drove danerys away.
I do not know what exactly Eliezer was thinking when he decided. I am not him. In fact, I wasn’t even there when it happened. I have no way of knowing whether he actually had a clear reason at the time, simply freaked out and made an impulsive decision, or actually believed it at that moment (at least to the extent of being unable to immediately rule it out completely, which might have led him to censor that post in order to postpone the argument). However, I have an idea which I find at least somewhat plausible. This is a guess.
Suppose even a very small number of people (let’s say 2-4 people) were affected (again, let’s remember that they would be very atypical; I doubt that having, e.g., OCD would be enough) in such a way that instead of only worrying about this idea, they would actually take action and e.g. sell a large part of their possessions and donate the proceeds to (what was then known as) SIAI, leave their jobs to work on FAI, or start donating all their income (neglecting their families) out of fear of this hypothetical torture. That would be a PR disaster many orders of magnitude larger than anything basilisk-related we have now. When people use the word “cult” nowadays, they seem to use it figuratively, as a hyperbole (e.g.); in that case, people and organizations who monitor real cults would actually label SIAI a literal one (whether SIAI likes it or not). That would be a disaster both for SIAI and for the whole friendly AI project, possibly burying it forever. Considering that Eliezer worried about such things even before this whole debacle, it must have crossed his mind, and this possibility must have looked very scary, leading to the impulsive decision and what we can now see as improper handling of the situation.
Then why not claim that you are doing this for PR reasons instead of out of concern for the psychological harm to those people? Firstly, one may actually care about those people, especially if one knows one of them personally (which seems to be the case from the screenshot provided by XiXiDu and linked by Jiro). And even in the more general case, talking about caring usually looks better than talking about PR. Secondly, “stop it, it is for your own safety” probably stops more people from looking than “stop it, it might give us bad PR” (as we can see from the recent media attention, the second reason stops basically nobody). Thirdly, even if Eliezer personally met all those affected people (once again, remember that they would be very atypical) and explicitly asked them not to do anything, they would understand that he has SIAI’s PR at stake and thus an incentive to lie to them about what they should do, and they wouldn’t want to listen to him (as even a remote possibility of torture might seem scary) and would, e.g., donate via another person. Or find their own ways of trying to create FAI. Or whatever they can come up with. Or find their own ways to fight the possible creation of AI. Or maybe even something like this. I don’t know; this idea did not cause me nightmares, so I do not claim to understand the mindset of those people. Here I must note that I am in no way claiming that a person would actually do that just because they have OCD.
Nowadays, however, what most people seem to want to talk about is not the idea of the basilisk itself, but rather the drama surrounding it. As it is sometimes used to dismiss all of LW (again, for reasons similar to this), many people get very defensive and pattern-match those who bring this topic up with the intent of actually discussing it (and the related events) to trolls who do it just for the sake of trolling.
Therefore this situation might feel worse to some people, especially those who are not targeted by the mass downvoting or who have so much karma that they can’t realistically be significantly affected by it.
I feel like I am putting a lot of effort into steelmanning everything. I guess I, too, got very defensive, potentially for the reasons mentioned in that SlateStarCodex post. Well, maybe everything was just a combination of many stupid decisions, impulsive behaviour and after-the-fact rationalizations, which, after all, might be the simplest explanation. I don’t know. Well, as I wasn’t even there, there must be people who are better informed about the events and better suited to argue.
Then why not claim that you are doing this for PR reasons instead of out of concern for the psychological harm to those people? Firstly, one may actually care about those people, especially if one knows one of them personally (which seems to be the case from the screenshot provided by XiXiDu and linked by Jiro).
XiXiDu’s screenshot is damning because it indicates that Eliezer banned the Basilisk because he thought a variation on it might work, not because of either PR reasons or psychological harm.
Unless you think he was lying about that for the same reason he might want to lie about psychological harm.
Well, in that post by XiXiDu there is a quote by Mitchell Porter (endorsed by Eliezer) which, combined with the [reddit post] I linked earlier, suggests that he was not able to provide a proof that no variation of the basilisk would ever work, given that there is more than one possible decision theory, including some exotic and obscure ones that have not yet been invented (but who knows what will be invented in the future). Eliezer seems to think that human minds are unable to actually follow such a decision theory rigorously enough for such a concept to work. But human ability is such a vague concept that it is not clear how one could give a formal proof.
However, it seems to me that an inability to provide a formal proof is an unlikely reason to freak out.
What (I guess) happened was that this inability to provide a proof, combined with that unnamed SIAI person’s nightmares (I would guess that Eliezer knows all SIAI people personally) and the fear of the aforementioned potential PR disaster, might have resulted in a feeling of losing control of the situation and made him panic, thus leading to that nervous and angry post, emphasizing the danger and the need to protect some people (and leaving out the cult PR reasons). This is my personal guess; I do not guarantee that it is correct.
Is an inability to actually deny a thing equivalent to a belief that the thing has a positive probability? Well, logically they are somewhat similar, but these two ways of expressing similar ideas certainly have different connotations and leave very different impressions in the listener’s mind about the person’s actual degree of belief.
(I must add that I personally do not like speculating about why another person did what he did when I actually have no way of knowing his motivations.)
I think many users do not think it’s a serious danger, but it’s still banned here. It is IMO reasonable for outsiders to judge the community as a whole by our declared policies.
Coming up with absurd ideas is not a problem. Plenty of absurd things are posted on LW all the time. The problem is that the community took it as a genuine danger.
If EY made a bad decision at the time that he now disagrees with, surely he would have reversed it, or at least dropped the ban for future posts. A huge part of what this site is all about is being able to recognize when you’ve made a mistake and respond appropriately. If EY is incapable of doing that then that says very bad things about everything we do here.
What’s cultish as hell to me is having leaders that would wilfully deceive us. If there are some nonpublic rules under which the basilisk is being censored, what else might also be being censored?
Well, nobody in the LW community is without flaws. People often fail (or sometimes do not even try) to live up to the high standards of being a good rationalist. The problem is that in some internet forums “judging the community” somehow becomes something like “this is what LW makes you believe, and even if they deny it, they do so only because not doing so would give them a bad image” or “they are a cult that wants you to believe in their robot god”, which are such gross misrepresentations of LW (or even of the drama surrounding the basilisk stuff) that even after considering Hanlon’s razor one is left wondering whether that level of misinterpretation is possible without at least some amount of intentional hostility. I would guess that nowadays a large part of the annoyance at somebody even bringing this topic up is a reaction to this perceived hostility.
If EY is incapable of doing that then that says very bad things about everything we do here.
No, it does not say very bad things about everything we do here. Whenever EY makes a mistake and fails to recognize and admit it, it is his personal failure to live up to the standards he wrote about so much. You may object that not enough people called him out on that on LW itself, but it is my impression that many of those who do so e.g. on reddit seem to be LW users (as there are currently few related discussions here on LW, there is no context to do that here; besides, EY rarely comments here anymore). In addition, in this thread there seem to be several LW users who agree with you, so you are definitely not a lone voice; among LWers there seem to be many different opinions. Besides, in that reddit thread he seems to basically admit that, in fact, he did make a lot of mistakes in handling this situation.
It has just dawned on me that while we are talking about censorship, we are at the same time having this discussion. And frankly, I do not remember the last time a comment was deleted solely for bringing this topic up. Maybe the ban has been silently lifted, or at least is no longer enforced (even though there was no public announcement about this), leaving everything to the local social norm? However, I would guess that, due to said social norm, one could predict that if one posted about this topic, then unless one made really clear that one is bringing it up out of genuine curiosity (and with a genuinely interesting question) and not for the sake of trolling, “let’s see what will happen”, or trying to make fun of people and their identity, one would receive a lot of downvotes (due to being pattern-matched, which sustains the social norm of not touching this topic). I feel I should add that I wouldn’t advise you to test this hypothesis, because that would probably be considered bringing the topic up for the sake of bringing it up. I’m not claiming the situation is perfect, and I would agree that in the ideal case the free marketplace of ideas should prevail, and this discrepancy between the current situation and the ideal case should be solved somehow.
May I suggest reading Singularity Sky by Charles Stross, which has precisely such a menacing future AI as an antagonist? (Spoiler: no basilisk memes involved in the plot; they’re obviously not obvious to everyone who thinks of this scenario.)
The overwhelming majority of everyone on LessWrong, now and previously, believes that The Thing is completely ridiculous and would never work at all. Last I heard, Eliezer barely gave thought to the concept that it would really work, but instead blew up at the fact that hapless, innocent readers were being very stressed out by their lack of understanding of why it can’t really work.
My contrarian idea: Roko’s basilisk is no big deal, but intolerance of making, admitting, or accepting mistakes is cultish as hell.
It is certainly possible that few people were affected by the Basilisk, and the people who do the censoring either overestimate the number or are just using it as an excuse. But this reflects badly on LW all by itself, and also amounts to “you cannot trust the people who do the censoring”, a position which is at least as unpopular as my initial one.
I would guess that the dislike of censorship is not an unpopular position, whatever its motivations.
Does “rolling my eyes and reading something else” count as “psychologically affected”?
I agree with this so much that, in order to not affect the mechanics of this thread, I’m going to upvote some other post of yours.
wait. now I’m not sure how to vote on THIS comment, which is brilliant.
If you want to point out LW beliefs that sound crazy to most people, I guess you don’t need to go as far as Roko’s basilisk. FAI or MWI would suffice.