I think the central argument of this post is grossly wrong. Sure, you can find some people who want to censor based on which opinions feel too controversial for their taste. But pretending that's the sole motivation is a quintessential strawman. It assumes the dumbest possible reason for why the other person holds a certain position. It's as if you criticized the Bible, and I assumed it could only be because you believe the Quran is the literal word of God instead.
We do not censor other people more conventional-minded than ourselves. We only censor other people more-independent-minded than ourselves. Conventional-minded people censor independent-minded people. Independent-minded people do not censor conventional-minded people. The most independent-minded people do not censor anyone at all.
Bullshit. If your desire to censor something is due to an assessment of how much harm it does, then it doesn't matter how open-minded you are. Open-mindedness is not a variable that goes into the calculation.
I happen not to care that much about the object-level question anymore (at least as it pertains to LessWrong), but on a meta level, this kind of argument should be beneath LessWrong. It actively frames any concern about unrestricted speech as poorly motivated, which makes it more difficult to have the object-level discussion.
And the other reason it’s bullshit is that no sane person is against all censorship. If someone wrote a post here calling for the assassination of Eliezer Yudkowsky with his real-life address attached, we’d remove the post and ban them. Any sensible discussion is just about where to draw the line.
I would agree that this post is directionally true, in that there is generally too much censorship. I certainly agree that there’s way too much regulation. But it’s also probably directionally true to say that most people are too afraid of technology for bad reasons, and that doesn’t justify blatantly dismissing all worries about technology. We have to be more specific than that.
Any attempt to censor harmful ideas actually suppresses the invention of new ideas (and correction of incorrect ideas) instead.
Proves too much (like that we shouldn’t ban gain-of-function research).
If your desire to censor something is due to an assessment of how much harm it does
Isn’t that basically always what’s claimed, yet rarely the case? It’s likely either because people cannot tell the difference between their dislikes and what’s harmful to society, or because the correct answer is unintuitive. In either case, as long as a topic is taboo, one is banned from figuring out what the real answer is.
It wasn't intuitive that legalizing porn was the way to go if you wanted a society with fewer sex crimes. It wasn't intuitive that legalizing alcohol was the way to go. It wasn't intuitive that legalizing drugs somehow reduced drug-related problems, it wasn't obvious in the past that making mental health issues taboo was a bad solution, etc.
"X is bad, we should ban it so that it goes away" is a naive way of thinking. An extremely open-minded person with low intelligence might still arrive at such a conclusion if he believes it to be correct, but he won't have all those negative emotions which are associated with pro-censorship viewpoints, and those mentalities are more of a problem than the actual censorship.
we’d remove the post and ban them
I believe there's a murky border between "speech" and "action" which is not obvious. I'm for free speech in an absolute sense, but if somebody yelled into my ear and caused hearing damage, I wouldn't consider that "speech" but "assault". It's not enough to name concrete examples like "slander", "threats", and "yelling fire in a theatre"; there's bound to be a simple explanation which separates speech and malicious actions clearly. I believe that such a clear definition will reveal censorship to be objectively nonoptimal.
Did you read the Paul Graham article I linked? Do you disagree with it too?
I hadn’t, but did now. I don’t disagree with anything in it.
Fascinating. You’re one of the names on Less Wrong that I associate with positive, constructive dialogue. We may have a scissor statement here.
The reason my tone was much more aggressive than normal is that I knew I'd be too conflict-averse to respond to this post unless I did it immediately, while still feeling annoyed. (You've posted similar things before and so far I've never responded.) But I stand by all the points I made.
The main difference between this post and Graham's post is that Graham just points out one phenomenon, namely that people with conventional beliefs tend to have less of an issue stating their true opinion. That seems straightforwardly true. In fact, I have several opinions that most people would find very off-putting, and I've occasionally received some mild social punishment for voicing them.
But Graham's essay doesn't justify the points you make in this post. It doesn't even justify the sentence where you linked to it ("Any attempt to censor harmful ideas actually suppresses the invention of new ideas (and correction of incorrect ideas) instead.") since he doesn't discuss censorship.
What bothers me emotionally (if that helps) is that I feel like this post is emotionally manipulative to an extent that's usually not tolerated on LessWrong. It feels like it's appealing to the libertarian/free-speech-absolutism/independent-thinker vibe more than it's trying to be truth-seeking. Well, that, and that it claims several things that apply to me, since I think some things should be censored. (E.g., "The most independent-minded people do not censor anyone at all." → you're not independent-minded, since you want to censor some things.)