apparently we can’t even talk about talking about it without the both of us being downvoted.
Many comments like this one get net upvotes. Are you concerned with total or net votes? What are your net votes from such discussions? How feasible would it be for you to reshape downvoted-type comments into upvoted-type comments, particularly since any one downvoted comment is unlikely to have received much of your editing attention?
both sides
Comments expressing the lament of “people on both sides” are likely to be downvoted by me. I have many similar loose heuristics, such as “vote down people arguing by definition”, “vote up people changing their mind”, “vote up people citing sources”, “vote down people who do not apply the principle of charity”, and “vote up comments in which people correctly use the word ‘literally’”.
You’re unlikely to avoid tripping at least one of these if you make multiple comments, but I think each heuristic is fair, and the type of content I want to see generally gets communicated. Consequently I suggest being less concerned with total downvotes, even if your only other change is to be more concerned with net downvotes. It isn’t really supposed to be possible to write without tripping any wire of any reader.
But I don’t see you getting net downvotes, so I’m not sure whether that is instead your complaint... which would be weird, since getting a near balance of favorable and unfavorable reviews isn’t a very harsh form of censorship.
I’m concerned not with my karma total but with the net-negative karma scores of individual comments. A net negative karma for a comment signals that Less Wrong does not want to see that comment. When I see net negative comments I try to figure out why. For any one comment there are lots of possible explanations for downvoting. But when I see a trend that suggests downvoting heuristics I think are bad, I sometimes publicly lament those heuristics. For example, I see a lot of bad pattern-matching downvoting, where people say things that, unexamined, resemble theistic apologist arguments. Taking any position that could be considered political also seems to attract downvotes, especially when that position is inconsistent with the values of the local demographic cluster. Laments about downvote heuristics seem to be a rather unpopular comment type as well. These heuristics have a chilling effect on the discussion of those subjects and partly explain Eugine’s complaint:
To be fair, the main problem on LessWrong, as opposed to the world in general, is people engaging in motivated stopping and motivated continuation when discussing these topics in an attempt to avoid being sexist (for some reason race is less of a problem) and/or bigoted.
Comments expressing the lament of “people on both sides” are likely to be downvoted by me.
Fair enough as a heuristic, though I’ll note I made it pretty clear which side I thought was worse in the preceding sentences. But don’t worry about your downvote; the heuristic you used was fine by me.
Laments about downvote heuristics seem to be a rather unpopular comment type as well.
Laments of downvote heuristics seem to be about why the complainer’s immediately preceding comments were downvoted.
How often are such complaints framed as laments that political allies were downvoted, much less neutrals or opponents?
If everyone writing a comment of content type X also always added spam links, I would downvote overconfident speculation about why people don’t like content X and how that makes them bad people.
But when I see a trend that suggests downvoting heuristics I think are bad, I sometimes publicly lament those heuristics. For example, I see a lot of bad pattern-matching downvoting, where people say things that, unexamined, resemble theistic apologist arguments. Taking any position that could be considered political also seems to attract downvotes.
Outside of threads I’m personally involved in, I try to downvote any comment which seems detrimental to the overall signal-to-noise ratio on LW. Most often that means posts which are statistically illiterate, incoherent, obviously biased, or poorly written, which I imagine should be uncontroversial. Beyond content and style, though, it’s also possible for a post’s framing to lower the signal-to-noise ratio through a variety of knock-on effects.
Usually this happens by way of halo effects and their negative-affect equivalent (let’s call that a miasma effect, if it lacks a proper name): arguments matching religious apologia too closely, for example, tend to trigger a cluster of negative associations in our largely atheistic audience that prime it for confrontation even if the content itself is benign. Likewise for comments with political framing or drawing unwisely from political examples. I don’t usually click the downvote button on comments like these until there’s evidence of them actually causing problems, but that’s sufficiently common that I still end up burning a lot of votes on them.
I submit that this isn’t a bad heuristic. It’s one that shouldn’t be necessary if we were all free from emotive priming effects, but we clearly aren’t, and exposing ourselves to many sources of them isn’t going to help us get rid of the problem; in the meantime, discouraging such comments seems like a useful way of keeping the shouting down.
If someone is going to be pulling a thread dangerously close to politics, we should expect those comments to be held to a higher standard.