I’m relatively new to the site and I wasn’t aware of any censorship. I suppose I can imagine that it might be useful and even necessary to censor things, but I have an intuitive aversion to the whole business. Plus I’m not sure how practical it is, since after you posted that I googled lesswrong censorship and found out what was being censored. I have to say, if they’re willing to censor stuff that causes nightmares, then they ought to censor talk of conspiracies, as I can personally attest that it has caused supreme discomfort. Conspiracies are a very harmful meme, and positing one can warp your sense of reality. I have bipolar disorder, and I was taking a medicine that increases the level of dopamine in my brain to help with some of the symptoms of depression. Dopamine (I recently rediscovered) increases your brain’s tendency to see patterns, and I had to stop taking a very helpful medication after reading this site. Maybe it would have happened anyway, but the world of conspiracy theories is very dark and my journey there was triggered by his writings. I guess most of the content on this site is disorienting though, but perhaps some clarification about what he thinks the benefits of conspiracies are and what their extent should be would help.
Also, the content on this site is pretty hard-hitting in a lot of ways; I find it inconsistent to censor things to protect sensitive people who think about AI but not people who are sensitive to all the other things that are discussed here. I think it’s emblematic of a broader problem with the community, which is that there’s a strong ingroup/outgroup barrier. That’s a problem when you’re trying to subsist on philanthropy and the ingroup is fairly tiny.
Maybe it would have happened anyway, but the world of conspiracy theories is very dark and my journey there was triggered by his writings.
Many websites about conspiracy theories don’t care much about the truth. They don’t go through the work of checking whether what they are saying is true.
On the other hand, organisations such as P2 exist or existed. The Mafia exists. To the extent that we care about truth, we can’t claim that there aren’t groups of people who coordinate in secret for the benefit of their members.
Italy is a pretty good country to think about when you want to think about conspiracies, because there is a lot of publicly available information.
It’s actually pretty easy to see flaws in the argument of someone who claims that the US government brought down the twin towers on 9/11 via explosives if you are actually searching for flaws and not only searching for evidence that the claim might be true. The same goes for lizard overlords.
I guess most of the content on this site is disorienting though, but perhaps some clarification about what he thinks the benefits of conspiracies are and their extent should be would help.
Learn to live with not knowing things. Learn to live with uncertainty. Living with uncertainty is one of the core skills of a rationalist. If you don’t know, then you don’t know, and wanting to know doesn’t change that. We live in a very complex world that we don’t fully understand.
Plus I’m not sure how practical it is, since after you posted that I googled lesswrong censorship and found out what was being censored.
You found out what was censored, but in a way that doesn’t give you an in-depth understanding of the censored debate, and you took no emotional harm.
Learning to live with not knowing things is good advice if you are trying to choose between “I explain this by saying that people are hiding things” and “I don’t have an explanation”.
Learning to live with not knowing things is poor advice in a context where people are actually hiding things from you and what is not known is what the people are hiding rather than whether the people are hiding something. It is especially poor advice where there is a conflict of interest involved—that is, when the same people telling you you’d be better off not knowing also stand to lose from you knowing.
Needless to say, 9/11 and lizard conspiracy theories fall in the first category and the material that has been censored from lesswrong falls in the second category.
Learning to live with not knowing things is poor advice in a context where people are actually hiding things from you and what is not known is what the people are hiding rather than whether the people are hiding something.
No, if you can’t stand thinking that you don’t know how things work, you are pretty easy to convince of a lie. You take the first lie that makes a bit of sense in your view of the world. The lie makes you feel like you understand the world. It feels better than uncertainty. Any decent organisation that operates in secret puts out lies to distract people who want to know the truth.
Andy Müller-Maguhn stood in front of the Chaos Computer Congress in Germany and managed to give a good description of how the NSA surveils the internet and how the German government lets them spy on German soil. At the time you could have called it a conspiracy theory. Those political Chaos Computer Club people are very aware of what they know and where they are uncertain. That’s required if you want to reason clearly about hidden information.
Needless to say, 9/11 and lizard conspiracy theories fall in the first category and the material that has been censored from lesswrong falls in the second category.
When it comes to 9/11, the government does hide things. 9/11 is not an event where all information is readily available. It’s pretty clear that the names of some Saudis are hidden. Bin Laden comes from a rich Saudi family, and the US wants to keep a good relationship with the Saudi government. I think it’s pretty clear that there is some information that the US didn’t want to have in the 9/11 report because the US doesn’t want to damage the relationship with the Saudis.
Various parts of the NSA and CIA do not want to share all their information about what they are doing with congressional inquiries. As a result they hid information from the 9/11 commission. The NSA wants to keep a lot of stuff out of the public eye that could be found out if a congressional commission dug around and got full cooperation. The chief of the NSA lied under oath to Congress about the US spying program. A congressional commission that investigated 9/11 fully would want to look at all the evidence that the NSA had gathered at that point, and that’s not what the NSA wants, even if the NSA didn’t do anything to make 9/11 happen.
If someone finds evidence of the NSA withholding information from a congressional commission, that shouldn’t surprise you at all, nor should it increase your belief that the NSA orchestrated 9/11, because they are always hiding stuff.
Information about Al Qaeda’s support for the Muslim fighters whom NATO helped in the fight for the independence of Kosovo isn’t clear.
The extent to which Chechen Muslim freedom fighters are financed by Saudi or Western sources isn’t clear. The same goes for the Uyghurs.
General information about the identities of people who engaged in short selling before 9/11 was hidden because the US government just doesn’t release all information about all short selling publicly.
The problem with 9/11 is that people go to school and learn that the government is supposed to tell them the truth and not hide things. Then they grow up a bit and are faced with a world where the government constantly hides information and lies. Then those people take the evidence that the government hides information in a case like 9/11 as evidence that the US government caused the twin towers to be destroyed with dynamite.
Politically, the question of whether to take 9/11 as a lesson to cut the money flow to Muslim ‘freedom fighters’ in Chechnya does matter, and it’s something where relevant information gets withheld.
I think you are misunderstanding me. The point is that there are two scenarios:
1) Someone doesn’t really know anything about some subject. But they find a conspiracy scenario appealing because they would rather “know” an explanation with little evidence behind it, rather than admit that they don’t know.
2) Information definitely is being hidden from someone, and they say “I want to know that information.”
Both of these involve someone wanting to know, but “wanting to know” is being used in very different ways. If you say that people should “learn to live without knowing things”, that’s a good point in the first scenario but not so good in the second scenario. And the second scenario is what’s taking place for the information that has been censored from lesswrong. (Considering that your reply was pretty much all about 9/11, do you even know what is being referred to by information that has been censored from lesswrong?)
“Learning to live without knowing things” doesn’t mean that you don’t value information. It means that when you can’t/don’t know, you’re not in constant suffering. It means that you don’t get all freaked out and desperate for anything that looks like an answer (e.g. a false conspiracy theory).
It’s the difference between experiencing crippling performance anxiety and just wanting to give a good performance. The difference between “panic mode” and “optimizing mode”. Once you can live with the worst case, fear doesn’t control you any more—but that doesn’t mean you’re not motivated to avoid the worst case!
2) Information definitely is being hidden from someone, and they say “I want to know that information.”
In the case of 9/11 there is definitely information that’s hidden. Anybody who roughly understands how the US government works should expect that’s true. Anybody who studies the issue in detail will find out that’s true.
do you even know what is being referred to by information that has been censored from lesswrong
Yes, I’m aware of three different instances in which information got censored on Lesswrong. There are additional instances where authors deleted their own posts which you could also call censorship.
I don’t think that the value of discovering the information in any of those three cases of censorship is very high to anyone.
In the case of 9/11 there is definitely information that’s hidden.
The two senses of “wanting to know” can both be applied to 9/11.
Someone who “wants to know” in the sense of ignoring evidence to be able to “know” that 9/11 was caused by a conspiracy is better off not wanting to know.
Someone who wants to know information about 9/11 that is hidden but actually exists is not better off not wanting to know. Wanting to know in this sense is generally a good thing. (Except for privacy and security concerns; but what politicians do is not a privacy matter, and a politician who says something should be hidden for national security is probably lying.)
I don’t think that the value of discovering the information in any of those three cases of censorship is very high to anyone
I was referring to the basilisk. Telling people what the basilisk is is very valuable as criticism of LW, and has high “negative value” to LW itself because of how embarrassing it is to LW.
You seem to think that wanting to know the truth means you get to decide what the information you don’t have will say. That isn’t true.
To the extent that there is an interest in weakening Russia and China geopolitically by funding separatist movements within their borders, there is obviously an interest in staying silent about how those movements get funded and which individuals do the funding.
US Senator Bob Graham made statements about how crucial information on the potential role of Saudi funding of the 9/11 attack got censored out of the report. (See Wikipedia: http://en.wikipedia.org/wiki/9/11_Commission_Report)
Whether or not you call that a conspiracy is irrelevant. Calling it a conspiracy is just a label.
How many Saudis would have to have what specific ties with Al Qaeda and parts of the US government for it to count as a conspiracy_tm? This is far from a black-and-white affair. Obsessing about the label makes you ignore the real issues that are at stake. The US government might very well be hiding information about the people who likely paid for 9/11.
Once you understand that fact, you might want to know the information. Unfortunately there is no easy way to know, especially as an individual. If you want a quick fix, you will believe a lie. You actually have to be okay with knowing that you don’t know if you don’t want to believe in lies.
I was referring to the basilisk. Telling people what the basilisk is is very valuable as criticism of LW, and has high “negative value” to LW itself because of how embarrassing it is to LW.
Explaining the whole story of what TDT is to someone, in a way that makes the basilisk debate make sense to them, is not an easy task. You are basically telling outsiders a strawman if you try to summarize the basilisk debate. In a lot of fields there are complex arguments that seem strange and silly to outsiders; the existence of those cases is no argument against those fields.
Another thing I learned while debating is that you focus on refuting your opponent’s strong arguments, not their weak ones. Good criticism isn’t criticism that focuses on obvious mistakes that someone makes. Good criticism focuses on the arguments that are actually strong, and it shows that there are better arguments against the position.
Steelmanning is better than arguing against a strawman when you want to be a valuable critic. If a strawman argument about the basilisk is the best you can do to criticize LW, LW is a pretty awesome place.
You are basically telling outsiders a strawman if you try to summarize the basilisk debate. In a lot of fields there are complex arguments that seem strange and silly to outsiders; the existence of those cases is no argument against those fields.
-- A whole lot of arguments on LW seem silly to outsiders. I just got finished arguing that it’s okay to kill people to take their organs (or rather, that it’s okay to do so in a hypothetical situation that may not really be possible). Should that also be deleted from the site?
-- LW has a conflict of interest when deciding that some information is so easy to take out of context that it must be suppressed, but when suppressing the information also benefits LW for other reasons. Conflicts of interest should generally be avoided because of the possibility that they taint one’s judgment—even if it’s not possible to prove that the conflict of interest does so.
-- I am not convinced that “they’re crazy enough to fall for the basilisk” is strawmanning LW. Crazy-sounding ideas are more likely to be false than non-crazy-sounding ideas (even if you don’t have the expertise to tell whether it’s really crazy or just crazy-sounding). Ideas which have not been reviewed by the scientific community are more likely to be false than ideas which have. You can do a legitimate Bayesian update based on the Basilisk sounding crazy.
-- Furthermore, LW doesn’t officially believe in the Basilisk. So it’s not “the Basilisk sounds crazy to outsiders because they don’t understand it”, it’s “even insiders concede that the Basilisk is crazy, it just sounds more crazy to outsiders because they don’t understand it”, which is a much weaker reason to suppress it than the former one.
A whole lot of arguments on LW seem silly to outsiders. I just got finished arguing that it’s okay to kill people to take their organs (or rather, that it’s okay to do so in a hypothetical situation that may not really be possible).
That debate is shared with academic ethics as, IIRC, a standard scenario given as criticism of some forms of utilitarian ethics, is it not? I think that’s a mitigating factor. It may sound funny to discuss ‘quarks’ (quark quark quark! funny sound, isn’t it?) or ‘gluons’ but that also is borrowed from an academic field.
-- A whole lot of arguments on LW seem silly to outsiders. I just got finished arguing that it’s okay to kill people to take their organs (or rather, that it’s okay to do so in a hypothetical situation that may not really be possible). Should that also be deleted from the site?
It’s not deleted because it’s silly to outsiders. You said it was important criticism. It’s not.
LW has a conflict of interest when deciding that some information is so easy to take out of context that it must be suppressed, but when suppressing the information also benefits LW for other reasons.
Discussions like the one we are having here aren’t suppressed on LW. If the basilisk censoring were about that, this discussion would be outside the limits, which it isn’t.
The problem with updating on the basilisk is that you don’t have access to the reasoning based on which the basilisk got censored. If you want to update on whether someone makes rational decisions, it makes a lot of sense to focus on instances where the person is actually fully open about why he does what he does.
It’s also a case where there was time pressure to make a decision, while a lot of LW discussions aren’t of that nature and intellectual positions get developed over months and years. A case where a decision was made within a day is not representative of the way opinions get formed on LW.
Discussions like the one we are having here aren’t suppressed
But outsiders wouldn’t have any idea what we’re talking about (unless they googled “Roko’s Basilisk”).
The problem with updating on the basilisk is that you don’t have access to the reasoning based on which the basilisk got censored. If you want to update on whether someone makes rational decisions, it makes a lot of sense to focus on instances where the person is actually fully open about why he does what he does.
Just because you don’t have all the information doesn’t mean that the information you do have isn’t useful. Of course updating on “the Basilisk sounds like a crazy idea” isn’t as good as doing so based on completely comprehending it, but that doesn’t mean it’s useless or irrational. Besides, LW (officially) agrees that it’s a crazy idea, so it’s not as if comprehending it would lead to a vastly different conclusion.
And again, LW has a conflict of interest in deciding that reading the Basilisk won’t provide outsiders with useful information. The whole reason we point out conflicts of interest in the first place is that we think certain parties shouldn’t make certain decisions. So arguing “LW should decide not to release the information because X” is inherently wrong—LW shouldn’t be deciding this at all.
It’s also a case where there was time pressure to make a decision, while a lot of LW discussions aren’t of that nature and intellectual positions get developed over months and years.
There was time pressure when the Basilisk was initially censored. There’s no time pressure now.
Explaining the whole story of what TDT is to someone, in a way that makes the basilisk debate make sense to them, is not an easy task. You are basically telling outsiders a strawman if you try to summarize the basilisk debate. In a lot of fields there are complex arguments that seem strange and silly to outsiders; the existence of those cases is no argument against those fields.
What does it mean “to make sense” of “the basilisk” debate? I am curious if you are suggesting that it makes sense to worry about any part or interpretation of it.
No matter what you think about RationalWiki in general, I believe it does a good job at explaining it. But if that is not the case, you are very welcome to visit the talk page there and provide a better account.
I find it inconsistent to censor things to protect sensitive people who think about AI but not people who are sensitive to all the other things that are discussed here.
To the extent there is censorship of dangerous information on LW, the danger is to the future of mankind rather than to the (very real, and I don’t mean to minimize this) feelings of readers.
One could make the argument that anything that harms the mission of lesswrong’s sponsoring organizations is to the detriment of mankind. I’m not opposed to that argument, but googling censorship of lesswrong did not turn up anything I considered to be particularly dangerous. Maybe that just means that the censorship is more effective than I would have predicted, or is indicative of a lack of imagination on my part.
I’d say that “censorship” (things that could be classified or pattern-matched to this word) happens less than once in a year. That could actually contribute to why people speak so much about it; if it happened every day, it would be boring.
From my memory, this is “censored”:
1) inventing scenarios about Pascal’s mugging by AI
2) debating, even hypothetically, harm towards specific people or organizations
3) replying to a downvoted post (automatically penalized by −5 karma)
Options 2 and 3 are just common sense and could happen on any website. Thus, most talk about “censorship” on LW focuses on option 1.
(By the way, if you learned about the “basilisk” on RationalWiki, here is a little thing I just noticed today: The RW article has a screenshot of dozens of deleted comments, which you will obviously associate with the incident. Please note that the “basilisk” incident happened in 2010, and the screenshot is from 2012. So this is not the censorship of the original debate. It is probably censorship of some “why did you remove this comment two years ago? let’s talk about it forever and ever” meta-threads that were, for a time, quite frequent and IMHO quite annoying.)
Also, when a comment or article is removed, at least the message about the removal stays there. There is no meta-censorship (trying to hide the fact that censorship happened). If you don’t see messages about removed comments at some place, it means no comments were removed there.
By meta-censorship I meant things like removing the content from the website without a trace, so unless you look at the google cache, you have no idea that anything happened, and unless someone quickly makes a backup, you have no proof that it happened.
Leaving the notices “this comment was removed” on the page is precisely what allowed RW to make a nice screenshot about LW censorship. LW itself provided evidence that some comments were deleted. Providing a hyperlink instead of screenshot would probably give the same information.
Also, I am mentioning the basilisk now, and I have above 95% confidence that this comment will not be deleted. (One of the reasons is that it doesn’t get into details; it doesn’t try to restart the whole debate. Another reason is that I don’t start a new thread.)
There’s not a lot of actual censorship of dangerous information “for the future of mankind”. Or at least, I rate that as fairly unlikely, given that when the scientific groundwork for a breakthrough has been laid, multiple people usually invent it in parallel, close to each other in time. Which means that unless you can get everyone who researches dangerous-level AI into LW, censoring on LW won’t really help; it will just ensure that someone less scrupulous publishes first.
“Three may keep a secret, if two of them are dead.”
Conspiracy is hard. If you don’t have actual legal force backing you up, it’s nearly impossible to keep information from spreading out of control—and even legal force is by no means a sure thing. The existence of the Groom Lake air station, for example, was suspected for decades before publicly available satellite images made it pointless to keep up even the pretense of secrecy.
For an extragovernmental example, consider mystery religions. These aren’t too uncommon: they’re not as popular as they once were, but new or unusual religions still often try to elide the deepest teachings of their faiths, either for cultural/spiritual reasons (e.g. Gardnerian Wicca) or because they sound as crazy as six generations of wolverines raised on horse tranquilizers and back issues of Weird Tales (e.g. Scientology).
Now, where’s it gotten them? Well, Gardnerian Wiccans will still tell you they’re drinking from a vast and unplumbed well of secret truths, but it’s trivially easy to find dozens of different Books of Shadows (some from less restrictive breakaway lineages, some from people who just broke their oaths) that agree on the broad strokes and many of the details of the Gardnerian mysteries. (Also many others that bear almost no resemblance beyond the name and some version of the Lesser Banishing Ritual of the Pentagram, but never mind that.) As to Scientology, Operation Clambake (xenu.net) had blown that wide open years before South Park popularized the basic outline of what’s charmingly known as “space opera”; these days it takes about ten minutes to fire up a browser and pull down a more-or-less complete set of doctrinal PDFs by way of your favorite nautical euphemism. Less if it’s well seeded.
“But these are just weird minority religions,” you say? “Knowing this stuff doesn’t actually harm my spiritual well-being, because I only care about the fivefold kisses when my SO’s involved and there’s no such thing as body thetans”? Sure, but the whole point of a mystery religion is selecting for conviction. Typically they’re gated by an initiation period measured in years and thousands of dollars, not to mention some truly hair-raising oaths; I don’t find it plausible that science broadly defined can do much better.
You are clearly right that conspiracy is hard. And yet, it is not impossible. Plenty of major events are caused by conspiracies, from the assassination of Julius Caesar to the recent coup in Thailand. In addition, to truly prevent a conspiracy, it is often necessary to do more than merely reveal it; if the conspirators have plausible deniability, then revealing (but not thwarting) the conspiracy can actually strengthen the plotters’ hands, as they can now coordinate more easily with outside supporters.
Successful conspiracies, like any other social organization, need incentive compatibility. Yes, it’s easy to find out the secrets of the Scientology cult. Not so easy to find out the secret recipe for Coca Cola, though.
I find it inconsistent to censor things to protect sensitive people who think about AI but not people who are sensitive to all the other things that are discussed here.
To the extent there is censorship of dangerous information on LW, the danger is to the future of mankind rather than to the (very real, and I don’t mean to minimize this) feelings of readers.
Have you asked the people who are able to censor information on LW, or do you just assume this to be the case?
Do the people in charge of LW censor information that is neither dangerous nor spam?
I infer it’s the case from being a regular reader of LW. I don’t know if LW censors other types of information in part because spam is not a well defined category.
dangerous information on LW, the danger is to the future of mankind
I think that would be far overstating the importance of this forum. If Eliezer/MIRI have some dark secrets (or whatever they consider to be dangerous knowledge), they surely didn’t make it to LW.
I’m relatively new to the site and I wasn’t aware of any censorship.I suppose I can imagine that it might be useful and even necessary to censor things, but I have an intuitive aversion to the whole business. Plus I’m not sure how practical it is, since after you posted that I googled lesswrong censorship and found out what was being censored. I have to say, if they’re willing to censor stuff that causes nightmares then they ought to censor talk of conspiracies, as I can personally attest that that has caused supreme discomfort. They are a very harmful meme and positing a conspiracy can warp your sense of reality. I have bipolar, and I was taking a medicine that increases the level of dopamine in my brain to help with some of the symptoms of depression. Dopamine (I recently rediscovered) increased your brain’s tendency to see patterns, and I had to stop talking a very helpful medication after reading this site. Maybe it would have happened anyway, but the world of conspiracy theories is very dark and my journey there was triggered by his writings. I guess most of the content on this site is disorienting though, but perhaps some clarification about what he thinks the benefits of conspiracies are and their extent should be would help.
Also, the content on this site is pretty hard hitting in a lot of ways, I find it inconsistent to censor things to protect sensitive people who think about AI but not people who are sensitive to all the other things that are discussed here. I think it’s emblematic of a broader problem with the community, which is that there’s a strong ingroup outgroup barrier, which is a problem when you’re trying to subsist on philanthropy and the ingroup is fairly tiny.
Many websites about conspiracy theories don’t care much about the truth. They don’t go through the work of checking whether what they are saying is true.
On the other hand organisations such as P2 exist or existed. The Mafia exists. To the extend that we care about truth we can’t claim that aren’t groups of people that coordinate together in secret for the benefits of their members. Italy is a pretty good country to think about when you want to think about conspiracies because there a lot of publically available information.
It’s actually pretty easy to see flaws in the argument of someone who claims that the US government brought down the twin towers on 9/11 via explosives if you are actually searching for flaws and not only searching for evidence that the claim might be true. The same goes for lizard overlords.
Learn to live with not knowing things. Learn to live with uncertainty. Living with uncertainty is one of the core skills as a rationalist. If you don’t know than you don’t know an wanting to know. We live in a very complex world that we don’t fully understand.
You found out what was censored in a way where you don’t understand the debate that was censored in depth and you took no emotional harm.
Learning to live with not knowing things is good advice if you are trying to choose between “I explain this by saying that people are hiding things” and “I don’t have an explanation”.
Learning to live with not knowing things is poor advice in a context where people are actually hiding things from you and what is not known is what the people are hiding rather than whether the people are hiding something. It is especially poor advice where there is a conflict of interest involved—that is, when the same people telling you you’d be better off not knowing also stand to lose from you knowing.
Needless to say, 9/11 and lizard conspiracy theories fall in the first category and the material that has been censored from lesswrong falls in the second category.
No, if you can’t stand thinking that you don’t know how things work you are pretty easy to convince of a lie. You take the first lie that makes a bit of sense in your view of the world. The lie feels like you understand the world. It feels better than uncertainty. Any decent organisation that operates in secret puts out lies to distract people who want to know the truth.
Andy Müller-Maguhn was standing in front of the Chaos Computer Congress in German and managed to give a good description of how the NSA surveils the internet and how the German government lets them spy on German soil. At the time you could have called it a conspiracy theory. Those political Chaos Computer Club people are very aware of what they know and where they are uncertain. That’s required if you want to reason clearly about hidden information.
When it comes to 9/11 the government does hide things. 9/11 is not an event where all information is readily available. It’s pretty clear that names of some Saudi’s are hidden. Bin Laden comes from a rich Saudi family and the US wants to keep a good relationship with the Saudi government. I think it’s pretty clear that there some information that the US didn’t want to have in the 9/11 report because the US doesn’t want to damage the relationship with the Saudis.
Various parts of the NSA and CIA do not want to share all their information about what they are doing with congressional inquiries. As a result they hid information from the 9/11 commission. The NSA wants to keep a lot of material out of the public eye that could be found out if a congressional commission dug around and got full cooperation. The chief of the NSA lied under oath to Congress about the US spying program. A congressional commission that investigated 9/11 fully would want to look at all the evidence the NSA had gathered at that point, and that’s not what the NSA wants, even if the NSA did nothing to make 9/11 happen.
If someone finds evidence of the NSA withholding information from a congressional commission, that shouldn’t surprise you at all, nor should it increase your belief that the NSA orchestrated 9/11, because they are always hiding stuff.
Information about Al Qaeda support for the Muslim fighters that NATO helped in the fight for the independence of Kosovo isn’t clear.
The extent to which Chechen Muslim freedom fighters are financed by the Saudis or by Western sources isn’t clear. The same goes for the Uyghurs.
General information about the identities of people who did short selling before 9/11 was hidden, because the US government simply doesn’t release all information about short selling publicly.
The problem with 9/11 is that people go to school and learn that the government is supposed to tell them the truth and not hide things. Then they grow up a bit and are faced with a world where the government constantly hides information and lies. Then those people take the evidence that the government hides information in a case like 9/11 as evidence that the US government brought the twin towers down with dynamite.
Politically, the question of whether to take 9/11 as a lesson to cut the money flow to Muslim ‘freedom fighters’ in Chechnya does matter, and it’s an area where relevant information gets withheld.
I think you are misunderstanding me. The point is that there are two scenarios:
1) Someone doesn’t really know anything about some subject. But they find a conspiracy scenario appealing because they would rather “know” an explanation with little evidence behind it, rather than admit that they don’t know.
2) Information definitely is being hidden from someone, and they say “I want to know that information.”
Both of these involve someone wanting to know, but “wanting to know” is being used in very different ways. If you say that people should “learn to live without knowing things”, that’s a good point in the first scenario but not so good in the second scenario. And the second scenario is what’s taking place for the information that has been censored from lesswrong. (Considering that your reply was pretty much all about 9/11, do you even know what is being referred to by information that has been censored from lesswrong?)
“Learning to live without knowing things” doesn’t mean that you don’t value information. It means that when you can’t or don’t know, you’re not in constant suffering. It means that you don’t get all freaked out and desperate for anything that looks like an answer (e.g. a false conspiracy theory).
It’s the difference between experiencing crippling performance anxiety and just wanting to give a good performance. The difference between “panic mode” and “optimizing mode”. Once you can live with the worst case, fear doesn’t control you any more—but that doesn’t mean you’re not motivated to avoid the worst case!
In the case of 9/11 there is definitely information that’s hidden. Anybody who roughly understands how the US government works should expect that’s true. Anybody who studies the issue in detail will find out that’s true.
Yes, I’m aware of three different instances in which information got censored on Lesswrong. There are additional instances where authors deleted their own posts which you could also call censorship.
I don’t think that the value of discovering the information in any of those three cases of censorship is very high to anyone.
The two senses of “wanting to know” can both be applied to 9/11.
Someone who “wants to know” in the sense of ignoring evidence to be able to “know” that 9/11 was caused by a conspiracy is better off not wanting to know.
Someone who wants to know information about 9/11 that is hidden but actually exists is not better off not wanting to know. Wanting to know in this sense is generally a good thing. (Except for privacy and security concerns, but politicians doing things is not privacy, and a politician who says something should be hidden for national security is probably lying).
I was referring to the basilisk. Telling people what the basilisk is is very valuable as criticism of LW, and has high “negative value” to LW itself because of how embarrassing it is to LW.
You seem to think that wanting to know the truth means you can decide what the information you don’t have will say. That isn’t true.
To the extent that there is an interest in weakening Russia and China geopolitically by funding separatist movements within their borders, there is obviously an interest in staying silent about how those movements get funded and which individuals do the funding.
US senator Bob Graham made statements about how crucial information on the potential role of Saudi funding of the 9/11 attack got censored out of the report. (see Wikipedia: http://en.wikipedia.org/wiki/9/11_Commission_Report) Whether or not you call that a conspiracy is irrelevant. Calling it a conspiracy is just a label.
How many Saudis would have to have what specific ties to Al Qaeda and to parts of the US government before it counts as a conspiracy™? This is far from a black-and-white affair. Obsessing over the label makes you ignore the real issues at stake. The US government might very well be hiding information about the people who likely paid for 9/11.
Once you understand that fact, you might want to know the information. Unfortunately there is no easy way to know, especially as an individual. If you want a quick fix, you will believe a lie. You actually have to be okay with knowing that you don’t know if you don’t want to believe in lies.
Explaining to someone the whole story of what TDT is, in a way that makes the basilisk debate make sense to them, is not an easy task. You are basically telling outsiders a strawman if you try to summarize the basilisk debate. In a lot of fields there are complex arguments that seem strange and silly to outsiders; the existence of those cases is no argument against those fields.
Another thing I learned while debating is that you focus on refuting your opponent’s strong arguments, not their weak ones. Good criticism isn’t criticism that focuses on the obvious mistakes someone makes. Good criticism focuses on positions that actually have strong arguments behind them, and shows that there are better arguments against those positions.
Steelmanning is better than arguing against a strawman when you want to be a valuable critic. If a strawman argument about the basilisk is the best you can do to criticize LW, LW is a pretty awesome place.
-- A whole lot of arguments on LW seem silly to outsiders. I just got finished arguing that it’s okay to kill people to take their organs (or rather, that it’s okay to do so in a hypothetical situation that may not really be possible). Should that also be deleted from the site?
-- LW has a conflict of interest when deciding that some information is so easy to take out of context that it must be suppressed, but when suppressing the information also benefits LW for other reasons. Conflicts of interest should generally be avoided because of the possibility that they taint one’s judgment—even if it’s not possible to prove that the conflict of interest does so.
-- I am not convinced that “they’re crazy enough to fall for the basilisk” is strawmanning LW. Crazy-sounding ideas are more likely to be false than non-crazy-sounding ideas (even if you don’t have the expertise to tell whether an idea is really crazy or just crazy-sounding). Ideas which have not been reviewed by the scientific community are more likely to be false than ideas which have. You can do a legitimate Bayesian update based on the Basilisk sounding crazy.
-- Furthermore, LW doesn’t officially believe in the Basilisk. So it’s not “the Basilisk sounds crazy to outsiders because they don’t understand it”, it’s “even insiders concede that the Basilisk is crazy, it just sounds more crazy to outsiders because they don’t understand it”, which is a much weaker reason to suppress it than the former one.
That debate is shared with academic ethics as, IIRC, a standard scenario given as criticism of some forms of utilitarian ethics, is it not? I think that’s a mitigating factor. It may sound funny to discuss ‘quarks’ (quark quark quark! funny sound, isn’t it?) or ‘gluons’ but that also is borrowed from an academic field.
It’s not deleted because it’s silly to outsiders. You said it was important criticism. It’s not.
Discussions like the one we are having here aren’t suppressed on LW. If the basilisk censoring were about that, this discussion would be out of bounds, which it isn’t.
The problem with updating on the basilisk is that you don’t have access to the reasoning based on which the basilisk got censored. If you want to update on whether someone makes rational decisions, it makes a lot of sense to focus on instances where the person is actually fully open about why they do what they do.
It’s also a case where there was time pressure to make a decision, while a lot of LW discussions aren’t of that nature and intellectual positions get developed over months and years. A case where a decision was made within a day is not representative of the way opinions get formed on LW.
But outsiders wouldn’t have any idea what we’re talking about (unless they googled “Roko’s Basilisk”).
Just because you don’t have all information doesn’t mean that the information you do have isn’t useful. Of course updating on “the Basilisk sounds like a crazy idea” isn’t as good as doing so based on completely comprehending it, but that doesn’t mean it’s useless or irrational. Besides, LW (officially) agrees that it’s a crazy idea, so it’s not as if comprehending it would lead to a vastly different conclusion.
And again, LW has a conflict of interest in deciding that reading the Basilisk won’t provide outsiders with useful information. The whole reason we point out conflicts of interest in the first place is that we think certain parties shouldn’t make certain decisions. So arguing “LW should decide not to release the information because X” is inherently wrong—LW shouldn’t be deciding this at all.
There was time pressure when the Basilisk was initially censored. There’s no time pressure now.
You underrate the intelligence of the folks who read LW. If someone wants to know, they google it.
Sure?
What does it mean “to make sense” of “the basilisk” debate? I am curious if you are suggesting that it makes sense to worry about any part or interpretation of it.
No matter what you think about RationalWiki in general, I believe it does a good job at explaining it. But if that is not the case, you are very welcome to visit the talk page there and provide a better account.
To the extent there is censorship of dangerous information on LW, the danger is to the future of mankind rather than to the (very real, and I don’t mean to minimize this) feelings of readers.
One could make the argument that anything that harms the mission of LessWrong’s sponsoring organizations is to the detriment of mankind. I’m not opposed to that argument, but googling LessWrong censorship did not turn up anything I considered particularly dangerous. Maybe that just means the censorship is more effective than I would have predicted, or it is indicative of a lack of imagination on my part.
I’d say that “censorship” (things that could be classified or pattern-matched to this word) happens less than once in a year. That could actually contribute to why people speak so much about it; if it happened every day, it would be boring.
From my memory, this is “censored”:
1) inventing scenarios about Pascal’s mugging by AI
2) debating, even hypothetically, harm towards specific people or organizations
3) replying to a downvoted post (automatically penalized by −5 karma)
And the options 2 and 3 are just common sense, and could happen on any website. Thus, most talk about “censorship” on LW focuses on the option 1.
(By the way, if you learned about the “basilisk” on RationalWiki, here is a little thing I just noticed today: The RW article has a screenshot of dozens of deleted comments, which you will obviously associate with the incident. Please note that the “basilisk” incident happened in 2010, and the screenshot is from 2012. So this is not the censorship of the original debate. It is probably a censorship of some “why did you remove this comment two years ago? let’s talk about it forever and ever” meta-threads that were quite frequent and IMHO quite annoying at some time.)
Also, when a comment or article is removed, at least the message about the removal stays there. There is no meta-censorship (trying to hide the fact that censorship happened). If you don’t see messages about removed comments at some place, it means no comments were removed there.
And yet earlier in your post you’re talking about some posts in 2012 about censorship in 2010 being deleted. Smells like meta-censorship to me.
By meta-censorship I meant things like removing the content from the website without a trace, so unless you look at the google cache, you have no idea that anything happened, and unless someone quickly makes a backup, you have no proof that it happened.
Leaving the notices “this comment was removed” on the page is precisely what allowed RW to make a nice screenshot about LW censorship. LW itself provided evidence that some comments were deleted. Providing a hyperlink instead of screenshot would probably give the same information.
Also, I am mentioning the basilisk now, and I have above 95% confidence that this comment will not be deleted. (One of the reasons is that it doesn’t get into details; it doesn’t try to restart the whole debate. Another reason is that I don’t start a new thread.)
There’s not a lot of actual censorship of dangerous information “for the future of mankind”. Or at least, I rate that as fairly unlikely, given that when the scientific groundwork for a breakthrough has been laid, multiple people usually invent it in parallel, close to each other in time. Which means that unless you can get everyone who researches dangerous-level AI onto LW, censoring on LW won’t really help; it will just ensure that someone less scrupulous publishes first.
“Three may keep a secret, if two of them are dead.”
Conspiracy is hard. If you don’t have actual legal force backing you up, it’s nearly impossible to keep information from spreading out of control—and even legal force is by no means a sure thing. The existence of the Groom Lake air station, for example, was suspected for decades before publicly available satellite images made it pointless to keep up even the pretense of secrecy.
For an extragovernmental example, consider mystery religions. These aren’t too uncommon: they’re not as popular as they once were, but new or unusual religions still often try to elide the deepest teachings of their faiths, either for cultural/spiritual reasons (e.g. Gardnerian Wicca) or because they sound as crazy as six generations of wolverines raised on horse tranquilizers and back issues of Weird Tales (e.g. Scientology).
Now, where’s it gotten them? Well, Gardnerian Wiccans will still tell you they’re drinking from a vast and unplumbed well of secret truths, but it’s trivially easy to find dozens of different Books of Shadows (some from less restrictive breakaway lineages, some from people who just broke their oaths) that agree on the broad strokes and many of the details of the Gardnerian mysteries. (Also many others that bear almost no resemblance beyond the name and some version of the Lesser Banishing Ritual of the Pentagram, but never mind that.) As to Scientology, Operation Clambake (xenu.net) had blown that wide open years before South Park popularized the basic outline of what’s charmingly known as “space opera”; these days it takes about ten minutes to fire up a browser and pull down a more-or-less complete set of doctrinal PDFs by way of your favorite nautical euphemism. Less if it’s well seeded.
“But these are just weird minority religions,” you say? “Knowing this stuff doesn’t actually harm my spiritual well-being, because I only care about the fivefold kisses when my SO’s involved and there’s no such thing as body thetans”? Sure, but the whole point of a mystery religion is selecting for conviction. Typically they’re gated by an initiation period measured in years and thousands of dollars, not to mention some truly hair-raising oaths; I don’t find it plausible that science broadly defined can do much better.
So I’m the only one here who actually took a hair-raising oath before making an account?
You’re not allowed to talk about the oath! Why am I the only one who seems able to keep it?
Because there are different factions at work, you naked ape.
Nah, I hear we traditionally save that for after you earn your 10,000th karma point and take the Mark of Bayes.
You probably need to get those 10K karma points from Main.
You are clearly right that conspiracy is hard. And yet, it is not impossible. Plenty of major events are caused by conspiracies, from the assassination of Julius Caesar to the recent coup in Thailand. In addition, to truly prevent a conspiracy, it is often necessary to do more than merely reveal it; if the conspirators have plausible deniability, then revealing (but not thwarting) the conspiracy can actually strengthen the plotters’ hands, as they can now coordinate more easily with outside supporters.
Successful conspiracies, like any other social organization, need incentive compatibility. Yes, it’s easy to find out the secrets of the Scientology cult. Not so easy to find out the secret recipe for Coca Cola, though.
Have you asked the people who are able to censor information on LW, or do you just assume this to be the case?
Do the people in charge of LW censor information that is neither dangerous nor spam?
I infer that it’s the case from being a regular reader of LW. I don’t know whether LW censors other types of information, in part because spam is not a well-defined category.
I think that would be far overstating the importance of this forum. If Eliezer/MIRI have some dark secrets (or whatever they consider to be dangerous knowledge), they surely didn’t make it to LW.