I downvoted because I think the benefit of making stuff like this socially unacceptable on LW outweighs the cost of the OP getting one less response to their survey. The reasons it might be “strong-downvote-worthy had it appeared in most other possible contexts” still apply here, and the costs of replacing it with a less-bad example seem fairly minimal.
“the costs of replacing it with a less-bad example seem fairly minimal.”
Can you elaborate? I think the costs (in the form of damaging the integrity of the inquiry) are quite high. If you’re going to crowdsource a list of unpopular beliefs, and carry out that job honestly, then the list is inevitably going to contain a lot of morally objectionable ideas. After all, being morally objectionable is a good reason for an idea to be unpopular! (I suppose the holders of such ideas might argue that the causal relationship between unpopularity and perception-of-immorality runs in the other direction, but we don’t care what they think.)
Now, I also enjoy our apolitical site culture, which I think reflects an effective separation of concerns: here, we talk about Bayesian epistemology. When we want to apply our epistemology skills to contentious object-level topics that are likely to generate “more heat than light”, we take it to someone else’s website. (I recommend /r/TheMotte.) That separation is a good reason to explicitly ban specific topics or hypotheses as being outside of the site’s charter. But if we do that, then we can’t compile a list of unpopular beliefs without lying about the results. Blatant censorship is the best kind!
(Keeping in mind that I have nothing to do with the inquiry and can’t speak for OP)
Why is it desirable for the inquiry to turn up a representative sample of unpopular beliefs? If that were explicitly the goal, I would agree with you; I’d also agree (?) that questions with that goal shouldn’t be allowed. However, I thought the idea was to have some examples of unpopular opinions to use in a separate research study, rather than to directly research what unpopular beliefs LW holds.
If the conclusion of the research turns out to be “here is a representative sample of unpopular LW beliefs: <a set of beliefs that doesn’t include anything too reactionary/politically controversial>”, that conclusion would be dishonest and unfortunate.
Heh. It’s interesting to even try to define what “representative” means for something that is defined by unpopularity. I guess the best examples are those that are so reprehensible or ludicrous that nobody is willing to even identify them.
I do understand your reluctance to give any positive feedback to an idea you abhor, even when it’s relevant and limited to one post. I look forward to seeing what results from it—maybe it will move the window, as you seem to fear. Maybe it’ll just be forgotten, as I expect.
Okay, that makes sense.
I’ve upvoted them because I think they are specifically appropriate and on-topic for this post, even though I agree that they’d be unwelcome on most of LW. When discussing (or researching) contrarian and unpopular ideas, it’s a straight-up mistake (selection and survivorship bias) to limit those ideas to only the semi-contrarian ones that fit into the site’s general Overton window (https://en.wikipedia.org/wiki/Overton_window).
Agreed. There’s no value in spreading this opinion.
What did you think was going to happen when you asked people for unpopular opinions?!
I think people are confused about how to evaluate answers here. Should we upvote opinions we agree with on the object level, as usual, or should we upvote opinions based on their usefulness for the kind of research the OP is trying to conduct (i.e. not mainstream, but not too obscure/random/quirky either, like 2+2=5)?
It seems like most have defaulted to the former interpretation, while the most-upvoted comments are advocating the latter. Clearer instructions are warranted here; the signal is all mixed up.
A close analogy would be a CNN segment bashing Trump posted on an Alt-Right site: the audience there might be confused as to whether they should dogpile on the post as representative of the opposing tribe or upvote it as a heroic act of exposing the opposing tribe’s ugly nature (this is usually resolved by an allegiance-declaring intro segment by the OP, but that isn’t always the case).
That is a dangerous opinion to hold. I believe there is value in all ideas, even ones that are horrible by our own subjective standards. Stuart has proposed something that may be ridiculous, but ignoring it doesn’t provide any insight into why it was proposed. You could easily springboard off of it and propose ideas such as:
It could be possible that very intelligent people who disagree with the norm trap themselves in dark areas while searching for answers.
Why is it a dangerous opinion to hold? I don’t know about others, but to me at least, valuing freedom of expression has nothing to do with valuing the ideas being expressed.
Let us review ideas and make comments without bias. Declaring that some ideas are so bad they cannot even be stated is dangerous, because we are ignoring them in favour of a bias.
They can be stated; nobody is contesting that. They can also be downvoted to hell, which is what I’m arguing for.