Do people think that a discussion forum on the moderation and deletion policies would be beneficial?
I would like to see a top-level post on moderation policy. But I would like for it to be written by someone with moderation authority. If there are special rules for discussing moderation, they can be spelled out in the post and commenters can abide by them.
As a newcomer here, I am completely mystified by the dark hints of a forbidden topic. Every hypothesis I can come up with as to why a topic might be forbidden founders when I try to reconcile it with the fact that the people doing the forbidding are not stupid.
Self-censorship to protect our own mental health? Stupid. Secrecy as a counter-intelligence measure, to safeguard the fact that we possess some counter-measure capability? Stupid. Secrecy simply because being a member of a secret society is cool? Stupid, but perhaps not stupid enough to be ruled out. On the other hand, I am sure that I haven’t thought of every possible explanation.
It strikes me as perfectly reasonable if certain topics are forbidden because discussion of such topics has historically been unproductive, has led to flame wars, etc. I have been wandering around the internet long enough to understand and even appreciate somewhat arbitrary, publicly announced moderation policies. But arbitrary and secret policies are a prescription for resentment and for time wasted discussing moderation policies.
"Self-censorship to protect our own mental health? Stupid."

My gloss on it is that this is at best a minor part, though it figures in.
The topic is an idea with horrific implications that are supposedly made more likely the more one thinks about it. Thinking about it in order to figure out what it might be is a bad idea, because you may independently come up with something just as dangerous. And if the horrific outcome is horrific enough, even a small rise in the probability of it happening would be very bad in expectation; the worked example below makes the arithmetic concrete.
More explanation of why many won't think it dangerous at all. This doesn't directly point anything out, but any details do narrow the search-space: V fnl fhccbfrqyl orpnhfr lbh unir gb ohl va gb fbzr qrpvqrqyl aba-znvafgernz vqrnf gung ner pbzzba qbtzn urer.
I personally don’t buy this, and think the censorship is an overblown reaction. Accepting it is definitely not crazy, however, especially given the stakes, and I’m willing to self-censor to some degree, even though I hate the heavy-handed response.
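To make the expected-value point above concrete, here is a minimal worked example. The magnitudes are assumptions chosen purely for illustration, not figures from the discussion: suppose the horrific outcome carries a utility loss of $U = 10^{20}$ (in arbitrary units), and thinking about the topic raises its probability by only $\Delta p = 10^{-9}$. Then

\[
\Delta \mathbb{E}[u] = \Delta p \cdot (-U) = 10^{-9} \times \left(-10^{20}\right) = -10^{11},
\]

an enormous expected loss despite the vanishingly small probability shift. Whether such tiny probability estimates can be taken at face value is exactly what the Pascal's wager comparison further down disputes.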
Another perspective: I read the forbidden idea and understood it, but I have no sense of danger, because (like the majority of humans) I don't really live my life in a way that's consistent with all the implications of my conscious rational beliefs. Even though it sounded like a convincing chain of reasoning to me, I find it difficult to have a personal emotional reaction, or to change my lifestyle, based on what seem to be extremely abstract threats.
I think only very committed rationalists would find that topics like this pose mental-health risks. Of course, that may include much of the LW population.
How about an informed consent form:
(1) I know that the SIAI mission is vitally important.
(2) If we blow it, the universe could be paved with paper clips.
(3) Or worse.
(4) I hereby certify that points 1 & 2 do not give me nightmares.
(5) I accept that if point 3 gives me nightmares that points 1 and 2 did not give me, then I probably should not be working on FAI and should instead go find a cure for AIDS or something.
I feel you should flesh out point (1) a bit (explain in more detail what the SIAI intends to do), but I agree with the principle. Upvoted.
I like it!
Although 5 could easily be replaced by "Go earn a lot of money in a startup, never think about FAI again, but still donate money to SIAI because you remember that you have some good reason to do so that you don't want to think about explicitly."
I read the idea, but it seemed to have basically the same flaw as Pascal's wager. On that ground alone it seemed like it shouldn't be a mental-health risk to anyone, but it could be that I missed some part of the argument. (I didn't save the post.)
My analysis was that it described a real danger. Not a topic worth banning, of course—but not as worthless a danger as the one that arises in Pascal’s wager.
I think that, even if this is a minor part of the reasoning for those who (unlike me) believe in the danger, it could easily be the best, most consensus* basis for an explicit deletion policy. I’d support such a policy, and definitely think a secret policy is stupid for several reasons.
*no consensus here will be perfect.
I think it's safe to tell you that your last two hypotheses are definitely not on the right track.