I think you are misunderstanding me. The point is that there are two scenarios:
1) Someone doesn’t really know anything about some subject. But they find a conspiracy scenario appealing because they would rather “know” an explanation with little evidence behind it than admit that they don’t know.
2) Information definitely is being hidden from someone, and they say “I want to know that information.”
Both of these involve someone wanting to know, but “wanting to know” is being used in very different ways. If you say that people should “learn to live without knowing things”, that’s a good point in the first scenario but not so good in the second scenario. And the second scenario is what’s taking place for the information that has been censored from LessWrong. (Considering that your reply was pretty much all about 9/11, do you even know what is being referred to by information that has been censored from LessWrong?)
“learning to live without knowing things” doesn’t mean that you don’t value information. It means that when you can’t/don’t know, you’re not in constant suffering. It means that you don’t get all freaked out and desperate for anything that looks like an answer (e.g. a false conspiracy theory).
It’s the difference between experiencing crippling performance anxiety and just wanting to give a good performance. The difference between “panic mode” and “optimizing mode”. Once you can live with the worst case, fear doesn’t control you any more—but that doesn’t mean you’re not motivated to avoid the worst case!
2) Information definitely is being hidden from someone, and they say “I want to know that information.”
In the case of 9/11 there is definitely information that’s hidden. Anybody who roughly understands how the US government works should expect that to be true. Anybody who studies the issue in detail will find that it is.
do you even know what is being referred to by information that has been censored from LessWrong
Yes, I’m aware of three different instances in which information got censored on LessWrong. There are additional instances where authors deleted their own posts, which you could also call censorship.
I don’t think that the value of discovering the information in any of those three cases of censorship is very high to anyone.
In the case of 9/11 there is definitely information that’s hidden.
The two senses of “wanting to know” can both be applied to 9/11.
Someone who “wants to know” in the sense of ignoring evidence to be able to “know” that 9/11 was caused by a conspiracy is better off not wanting to know.
Someone who wants to know information about 9/11 that is hidden but actually exists is not better off not wanting to know. Wanting to know in this sense is generally a good thing. (Except for privacy and security concerns; but what politicians do in office is not a privacy matter, and a politician who says something should be hidden for national security reasons is probably lying.)
I don’t think that the value of discovering the information in any of those three cases of censorship is very high to anyone
I was referring to the basilisk. Telling people what the basilisk is is very valuable as criticism of LW, and has high “negative value” to LW itself because of how embarrassing it is to LW.
You think that wanting to know the truth means that you get to decide in advance what the information you don’t have will say. That isn’t true.
To the extent that there is an interest in weakening Russia and China geopolitically by funding separatist movements within their borders, there is obviously an interest in staying silent about how those movements get funded and which individuals do the funding.
US Senator Bob Graham made statements about how crucial information on the potential role of Saudi funding of the 9/11 attacks was censored out of the report (see Wikipedia: http://en.wikipedia.org/wiki/9/11_Commission_Report).
Whether or not you call that a conspiracy is irrelevant. Calling it a conspiracy is just a label.
How many Saudis would have to have what specific ties with Al Qaeda and parts of the US government for it to count as a conspiracy_tm? This is far from a black-and-white affair. Obsessing over the label makes you ignore the real issues that are at stake. The US government might very well be hiding information about the people who likely paid for 9/11.
Once you understand that fact, you might want to know the information. Unfortunately there is no easy way to know, especially as an individual. If you want a quick fix, you will end up believing a lie. You actually have to be okay with knowing that you don’t know if you don’t want to believe in lies.
I was referring to the basilisk. Telling people what the basilisk is is very valuable as criticism of LW, and has high “negative value” to LW itself because of how embarrassing it is to LW.
Explaining to someone the whole story of what TDT is, in a way that makes the basilisk debate make sense to them, is not an easy task. You are basically telling outsiders a strawman if you try to summarize the basilisk debate. In a lot of fields there are complex arguments that seem strange and silly to outsiders; the existence of those cases is no argument against those fields.
Another thing that I learned while doing debating is that you focus on refuting your opponent’s strong arguments, not their weak ones. Good criticism isn’t criticism that focuses on the obvious mistakes that someone makes. Good criticism focuses on issues where there actually are strong arguments, and it shows that there are better arguments against the position.
Steelmanning is better than arguing against a strawman when you want to be a valuable critic. If a strawman argument about the basilisk is the best you can do to criticize LW, LW is a pretty awesome place.
You are basically telling outsiders a strawman if you try to summarize the basilisk debate. In a lot of fields there are complex arguments that seem strange and silly to outsiders; the existence of those cases is no argument against those fields.
-- A whole lot of arguments on LW seem silly to outsiders. I just got finished arguing that it’s okay to kill people to take their organs (or rather, that it’s okay to do so in a hypothetical situation that may not really be possible). Should that also be deleted from the site?
-- LW has a conflict of interest when deciding that some information is so easy to take out of context that it must be suppressed, but when suppressing the information also benefits LW for other reasons. Conflicts of interest should generally be avoided because of the possibility that they taint one’s judgment—even if it’s not possible to prove that the conflict of interest does so.
-- I am not convinced that “they’re crazy enough to fall for the basilisk” is strawmanning LW. Crazy-sounding ideas are more likely to be false than non-crazy-sounding ideas (even if you don’t have the expertise to tell whether it’s really crazy or just crazy-sounding). Ideas which have not been reviewed by the scientific community are more likely to be false than ideas which have. You can do a legitimate Bayesian update based on the Basilisk sounding crazy (see the worked example after this list).
-- Furthermore, LW doesn’t officially believe in the Basilisk. So it’s not “the Basilisk sounds crazy to outsiders because they don’t understand it”, it’s “even insiders concede that the Basilisk is crazy, it just sounds more crazy to outsiders because they don’t understand it”, which is a much weaker reason to suppress it than the former one.
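To make that update concrete, here is a minimal worked application of Bayes’ theorem with purely illustrative numbers (none of these probabilities come from anywhere; they only show the direction of the update). Suppose you start at even odds that an unfamiliar idea is false, and you think false ideas sound crazy far more often than true ones do:

P(false | sounds crazy) = P(sounds crazy | false) * P(false) / [P(sounds crazy | false) * P(false) + P(sounds crazy | true) * P(true)]

With P(false) = P(true) = 0.5, P(sounds crazy | false) = 0.8, and P(sounds crazy | true) = 0.2:

P(false | sounds crazy) = (0.8 * 0.5) / (0.8 * 0.5 + 0.2 * 0.5) = 0.4 / 0.5 = 0.8

So “this sounds crazy” alone can legitimately move you from 50% to 80% confidence that the idea is false, even before you understand it in detail.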
A whole lot of arguments on LW seem silly to outsiders. I just got finished arguing that it’s okay to kill people to take their organs (or rather, that it’s okay to do so in a hypothetical situation that may not really be possible).
That debate is shared with academic ethics; IIRC it’s a standard scenario given as a criticism of some forms of utilitarian ethics, is it not? I think that’s a mitigating factor. It may sound funny to discuss ‘quarks’ (quark quark quark! funny sound, isn’t it?) or ‘gluons’, but that terminology is also borrowed from an academic field.
-- A whole lot of arguments on LW seem silly to outsiders. I just got finished arguing that it’s okay to kill people to take their organs (or rather, that it’s okay to do so in a hypothetical situation that may not really be possible). Should that also be deleted from the site?
It’s not deleted because it’s silly to outsiders. You said it was important criticism. It’s not.
LW has a conflict of interest when deciding that some information is so easy to take out of context that it must be suppressed, but when suppressing the information also benefits LW for other reasons.
Discussions like the one we are having here aren’t suppressed on LW. If the basilisk censorship were about that, this discussion would be out of bounds, which it isn’t.
The problem with updating on the basilisk is that you don’t have access to the reasoning based on which the basilisk got censored. If you want to update on whether someone makes rational decisions, it makes a lot of sense to focus on instances where the person is actually fully open about why he does what he does.
It’s also a case where there was time pressure to make a decision, while a lot of LW discussions aren’t of that nature and intellectual positions get developed over months and years. A case where a decision was made within a day is not representative of the way opinions get formed on LW.
Discussions like the one we are having here aren’t suppressed
But outsiders wouldn’t have any idea what we’re talking about (unless they googled “Roko’s Basilisk”).
The problem with updating on the basilisk is that you don’t have access to the reasoning based on which the basilisk got censored. If you want to update on whether someone makes rational decisions, it makes a lot of sense to focus on instances where the person is actually fully open about why he does what he does.
Just because you don’t have all the information doesn’t mean that the information you do have isn’t useful. Of course updating on “the Basilisk sounds like a crazy idea” isn’t as good as doing so based on completely comprehending it, but that doesn’t mean it’s useless or irrational. Besides, LW (officially) agrees that it’s a crazy idea, so it’s not as if comprehending it would lead to a vastly different conclusion.
And again, LW has a conflict of interest in deciding that reading the Basilisk won’t provide outsiders with useful information. The whole reason we point out conflicts of interest in the first place is that we think certain parties shouldn’t make certain decisions. So arguing “LW should decide not to release the information because X” is inherently wrong—LW shouldn’t be deciding this at all.
It’s also a case where there was time pressure to make a decision, while a lot of LW discussions aren’t of that nature and intellectual positions get developed over months and years.
There was time pressure when the Basilisk was initially censored. There’s no time pressure now.
Explaining to someone the whole story of what TDT is, in a way that makes the basilisk debate make sense to them, is not an easy task. You are basically telling outsiders a strawman if you try to summarize the basilisk debate. In a lot of fields there are complex arguments that seem strange and silly to outsiders; the existence of those cases is no argument against those fields.
What does it mean for “the basilisk” debate “to make sense”? I am curious whether you are suggesting that it makes sense to worry about any part or interpretation of it.
No matter what you think about RationalWiki in general, I believe it does a good job of explaining it. But if that is not the case, you are very welcome to visit the talk page there and provide a better account.
But outsiders wouldn’t have any idea what we’re talking about (unless they googled “Roko’s Basilisk”).
You underrate the intelligence of the folks who read LW. If someone wants to know, he googles it.
Sure?