While the LW voting system seems to work, and it is possibly better than the absence of any threshold, my experience is that the posts that contain valuable and challenging content don’t get upvoted, while the most upvotes are received by posts that state the obvious or express an emotion with which readers identify.
I feel there’s some counterproductivity there, as well as an encouragement of groupthink. Most significantly, I have noticed that posts which challenge that which the group takes for granted get downvoted. In order to maintain karma, it may in fact be important not to annoy others with ideas they don’t like—to avoid challenging majority wisdom, or to do so very carefully and selectively. Meanwhile, playing on the emotional strings of the readers works like a charm, even though that’s one of the most bias-encouraging behaviors, and rather counterproductive.
I find those flaws of some concern for a site like this one. I think the voting system should be altered to make upvoting as well as downvoting more costly. If you have to pick and choose which comments and articles to upvote or downvote, I think people will vote with more reason.
There are various ways to make voting costlier, but an easy way would be to restrict the number of votes anyone has. One solution would be to tie votes to karma. If I’ve gained 500 karma, I should be able to upvote or downvote F(500) comments, where F would probably be a log function of some sort. This would give more leverage to people who are more active contributors, especially those who write well-accepted articles (since you get 10x karma per upvote for those), and it would also limit the damage from casual participants who might otherwise be inclined to vote more emotionally.
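As a rough illustration of the kind of budget function being proposed, here is a minimal sketch in Python. The choice of log base, the scale factor, and the zero floor are my own assumptions for illustration; the proposal itself only says F “would probably be a log function of some sort”.

```python
import math

def vote_budget(karma: int, scale: float = 20.0) -> int:
    """Hypothetical vote allowance F(karma), shaped like a log function.

    The scale factor and the use of log10 are illustrative assumptions,
    not part of the original suggestion.
    """
    if karma <= 0:
        return 0
    return int(scale * math.log10(1 + karma))

# Example: under these made-up parameters, a user with 500 karma
# could cast vote_budget(500) == 53 upvotes or downvotes.
print(vote_budget(500))
```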
Um, that math doesn’t work out unless the number of new users expands exponentially fast. You need F(n) to be at least n, and probably significantly greater, in order to avoid a massive bottleneck.
I thought of that too, but then I realized the karma:upvote conversion rate on posts is 10:1, which complicates the analysis of the karma economy.
If F(n) < n, then yes, karma disappears from the system when voting on comments, but is pumped back in when voting on articles.
It does appear that the choice of a suitable F(n) isn’t quite obvious, and this is probably why F(n) = infinity is currently used.
Still, I think that a more restrictive choice would produce better results, and less frivolous voting.
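To make the exchange above concrete, here is a toy back-of-the-envelope calculation. The 10:1 karma-per-upvote rate for articles comes from the thread; the assumptions that comment upvotes grant 1 karma, that all votes cast are upvotes, and the particular article-vote fractions are mine, chosen only for illustration.

```python
def karma_minted_per_vote(article_vote_fraction: float) -> float:
    """Average karma created in the system by one cast vote, assuming
    article upvotes grant 10 karma and comment upvotes grant 1 karma."""
    return 10 * article_vote_fraction + 1 * (1 - article_vote_fraction)

for frac in (0.0, 0.1, 0.3):
    print(f"{frac:.0%} of votes on articles -> "
          f"{karma_minted_per_vote(frac):.1f} karma minted per vote")

# Under these assumptions, even a modest share of article votes means each
# vote cast creates more than one karma point, so a budget F(n) that grows
# more slowly than n does not automatically starve the system of votes.
```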
A community is only as good as its constituents. I would hope that there are enough people around who like majority-wisdom-challenging insights, to offset this problem. “Insights” being the key word.
Are you aware that downvotes are already limited by karma? Limiting upvotes as well might have merit.
There probably needs to be a bias towards upvotes, however; otherwise it will be very difficult to get significant positive karma.
See what I mean about the voting system being broken?
http://lesswrong.com/lw/1r9/shut_up_and_divide/1lxw
Currently voted −2 and below threshold.
Completely rational points of view that people find offensive cannot be expressed.
This is a site that is supposed to be about countering bias. Countering bias necessarily involves assaulting our emotional preconceptions which are the cause of falsity of thought. Yet, performing such assaults is actively discouraged.
Does that make this site Less Wrong, or More Wrong?
You’re getting downvoted for overconfidence, not for the content of your point of view.
The utilitarian point of view is that beyond some level of salary, more money has very small marginal utility to an average First World citizen, but would have a huge direct impact in utility on people who are starving in poor countries.
Your point is that the indirect impacts should also be considered, and that perhaps when they are taken into account the net utility increase isn’t so clear. The main indirect impact you identify is increasing dependency on the part of the recipients.
Your concern for the autonomy of these starving people is splendid, but the fact remains that without aid their lives will be full of suffering. Your position appears to be “good riddance”. You can’t fault people for being offended at the implied lack of compassion.
I suspect that your appeal for sympathy towards your position is doubly likely to fall on deaf ears as a result. Losing two karma points isn’t the end of the world, and does not constitute suppression. Stop complaining, and invest some effort in presenting your points of view more persuasively.
If denis is just being overconfident, couldn’t we just say “you’re being overconfident here, probably because you neglected to consider …” and reserve downvotes for trolls and nonsense (i.e., comments that clearly deserve to be hidden from view)?
Downvotes signal “would like to see fewer comments like this one”. This certainly applies to trolls and nonsense, but it feels appropriate to use the same signal for comments which, if the author had taken a little more time to compose them, readers wouldn’t need to spend time correcting one way or another. The calculation I’ve seen at least once here (and tend to agree with) is that you should value your readers’ time about 10x more than you value your own.
The appropriate thing to do if you receive downvotes and you’re neither a troll nor a crackpot seems to simply ask what’s wrong. Complaining only makes things worse. Complaining that the community is exhibiting censorship or groupthink makes things much worse.
Looking at the comment in question, Denis claims that charity “only” rewards bad things and discourages good ones. That is nonsense on its face, and it’s combined with mind-killing politics: ideological libertarianism about the immorality of paying taxes that benefit those labelled dysfunctional. I agree with Robin Hanson on this point.
Honestly, the system is doing exactly what it is supposed to be doing. If you think it is broken, I suspect you are expecting it to do something other than its purpose.
When I get frustrated by the karma system it is because I keep wanting more feedback than it provides. But this is a problem with me, not a problem with the system.
Then the solution to this “problem” would be to not care as much about feedback, or to want less feedback than you think you should get? Couldn’t it also be addressed by adding mechanisms for providing more feedback? I don’t see why the problem has to be on one particular side when solutions could be had either way.
ETA: I agree that denisbider appears to be injecting too much political thinking into his comments and calling it rationality, while not providing adequate support of his positions outside of said politics, and that the karma system is justifiably punishing him for it.
I can also see a potential for the current system to have a ‘delicate balance’ where changes trying to improve it could be offset by negative outcomes, but I don’t think that case has been made.
(1) I don’t see my comments as being political. If you perceive them as injecting politics, then I suspect it’s because you are used to hearing similar things in a more political environment. My comments are about reason, empathy, charity, value systems, and how they fit together.
(2) I am unable to substantiate my positions if people don’t respond. When people do respond, then I have some understanding of the differences between my viewpoint and theirs, and can substantiate. But I don’t believe it’s reasonable to expect all possible counter-arguments to be preempted in a comment of reasonable length.
(3) The “delicate balance” argument is specious—it is a form of bias in favor of what already exists. If we had a different system, then you would be calling that system a “delicate balance”.
1) Some of your earlier comments, especially those most negatively rated, set off all of my “political talking points” alarm bells. I note that many of your later comments aren’t so rated, and that you seem to be getting better at conveying your message.
2) Your replies to replies seem to be going fairly well so far.
3) I agree that it is only potential. Thomblake posted a good link on that very topic, and it is also why I said the case had not been made, and put the phrase in quotes. However, calling it specious and saying I would agree with any system is exactly the sort of thing I was talking about. Just because it’s a potential bias doesn’t mean that it is necessarily in effect, nor that its effects are so strong that it shows things are obviously broken. We do a lot of probabilistic thinking around here...
Since we’re doing probabilistic thinking, I would assign a high probability to the current system being imperfect, simply because (1) it is the system with which the site was designed prior to developing experience, and (2) the system is observed to have faults.
These faults seem to be fixable by making voting costlier, prompting readers to invest more thought when they decide to vote. I don’t even expect that this would necessarily improve my karma, but I think it would increase thoughtfulness, decrease reactivity, and improve quality overall.
There should probably be a daily limit to how many comments people can make, too. I think it would encourage longer and more thoughtful comments rather than shorter and more reactive ones.
it is the system with which the site was designed prior to developing experience

Patently false.
I disagree on both points.
Related: Reversal test
ETA: on second thought, more related: status quo bias
The frustration is the problem. If I keep getting frustrated, then I keep expecting something other than reality.
I already upvoted you before reading this comment. It can take a little time for votes to settle. Also, you can set your threshold to a different value. The default is less than −2.
Thanks.
Oh, well now it’s −6. :))
Incidentally, I also up-voted your comment about how charity is unhelpful because it enables helplessness (even though I disagree), because I definitely think it’s valuable to have both arguments represented. However, I did expect your comment would be down-voted, because my impression is that the group here has already considered Ayn Rand and disagrees with her ideologically. I wouldn’t say they found your comment offensive … there are just certain themes that are developed here more than others, and that was an anti-theme note.
Do you think having certain ‘group themes’ is bad for rationality?
My observations aren’t Randian in origin. At least, I haven’t read her books; I even somewhat disapprove of her, from what I know of her idiosyncrasies as a person.
I do think that this is an important topic for this group to consider, because the community is about rationality. My observation is that many commenters don’t seem to recognize the proper role of empathy in our emotional spectrum, and are trying to extend their empathy to their broader environment in ways that don’t make sense.
Also, if my anti-empathy comment is being downvoted because it isn’t part of a group theme, then the pro-empathy comments should be downvoted as well, but they are not. This indicates that people vote based on what they agree with, whether it is in context or not, rather than based on whether a comment is in context and provides food for thought.
This indicates you haven’t understood me: pro-empathy IS the theme here on Less Wrong. For a variety of reasons, this community tends to have ‘humanist goals’. This is considered to not be in conflict with rationality, because rationality is about achieving your goals, not choosing them. If you have a developed rational argument for why less charity would further humanist goals, there may be some interest, but much less interest if your argument seems based on a lack of humanist goals.
But the definition of “humanity” isn’t even coherent, and is incompatible with the shades of gray that actually exist.
Until these fundamentals are thought out, there can be lots of hot air, but progress toward a goal cannot be made, as long as the goal is incoherent.
It seems to me that the type of humanism you’re talking about is based on an assumption that “other people are like me, and should therefore be just as valuable to me as I am”.
But other people, especially of different cultures and genetic heritage, have strikingly different values, strikingly different perceptions, different capacities to think, understand and create.
The differences are such that drawing the compassion line at the borders of the human race makes about as much sense as at any other arbitrary point in the biological spectrum.
I believe that, to be consistent in valuing empathy as a goal on its own, you have to have empathy with everything. I find that a laudable position. But the sad fact is, most of us here aren’t vegan, nor do we even want to be. (I would be if most people were.)
People are selfish, and do not have empathy for everything. Most people pretend to have empathy for the world as a whole, whereas in fact they only have empathy for the people closest to them, and perhaps not even them, when push comes to shove.
All that having been said, and the world being as selfish as it is, when you say that you’re a humanist, that you want to better the lot of other people, and that you contribute 50% of your income to charity (just as an example), you are basically saying that you’re a sucker, and that your empathic circuits are so out of control that you let other people exploit you.
Given that we are the way we are, I think a much more reasonable goal is to foster a world that shares our values, not to foster the existence of the arbitrary people who don’t share our values, but exist today.
People do to some extent vote based on what they agree with, and at least a few make no bones about that. But people also vote based on style: based on whether it feels like you are trying to learn and contribute to our learning, or trying to appear superior and gain status. You look like the latter to me. And I think that you could be arguing the same things, in ways that are no less honest, and get positive karma if you just use different words.
I hear Socrates wasn’t popular either.
I’m no Socrates, but focusing on style instead of essence is incorrect.
Some of the best lessons I’ve learned were from people who were using a very blunt style.
I am not trying to appear superior, nor to gain status. If I wanted that, I would not be using a style which I know is likely to antagonize. I use a blunt style at the expense of my status and for the benefit of the message, not the other way around.
You’re saying some things which I’ve considered attempting to say but have self-censored to some extent due to expecting negative karma. You aren’t necessarily saying them in exactly the way I would have tried to put it, and I don’t necessarily agree with everything you’ve been saying but I broadly agree and have been upvoting most of your recent posts.
I agree with much of what he seems to be trying to convey. However, in many cases, the style is far too reminiscent of political talking points. Bluntness is useful insofar as it simplifies a message to its essential meaning. Talking points corrupt that process by injecting emotional appeals and loaded terms.
Perhaps I would know better than to do that if I were more exposed to US culture, but I am originally from Europe and I tend to abhor political wars for their vacuousness, so perhaps I’m inadvertently using words in ways that are reminiscent of politics.
To remove the word “politics” from my description: you seem very sure of yourself, to the point where it appears you are not taking uncertainty into account where you should be. You state your views as facts about the world, even when discussing things like the utilitarian value of certain actions, where there are competing views on the topic, and you do the discussion a disservice by failing to mention or explain why your opinions are better than the competing theories, or even to acknowledge that they are opinions.

You don’t provide the evidence; you provide a statement of “fact” in isolation, sometimes going so far as to claim special knowledge and to ask the audience to do things you know very well will not make for an easy or quick discussion (like “Go spend a few years in Africa”). I found that my alarms deactivated for your response to my comment that we think probabilistically, because the claims there were testable and better labeled.
Points taken, thank you.
I was also moved by these concerns, and find that comments sharing these general traits degrade norms of discussion (e.g. clarity, use of evidence, distinguishing between normative and descriptive claims).
Perhaps we need a post setting out these norms clearly, so we can point newcomers to it?
I would very much welcome “a brief guide on how to get taken seriously by the LW community.”
A wiki entry would probably be the appropriate solution.
As with most things, it should probably be a top-level article first and a wiki entry second...
Thanks Matt. I generally try to take this role because I’m aware that the character traits that allow me to do this are somewhat rare, and that the role is valuable in balance.
I’m also aware of the need to improve my skills of getting the message across, but this takes time to develop.
There is some relevant discussion of how our empathy and instinctive moral reactions conflict with efficient markets in this interview with Hayek. The whole thing is worth watching, but the part of the interview most relevant to this discussion starts at 45:25. Unfortunately Vimeo does not support links directly to a timestamp, so you have to wait for the video to load before jumping to the relevant point.
ETA a particularly relevant quote:

But we are up against this very strong, and in a sense justified resistance of our instincts and that’s our whole problem. A society which is efficient cannot be just. And unfortunately a society which is not efficient cannot maintain the present population of the world. So I think our instincts will have to learn. We shall perhaps for generations still be fighting the problem and fluctuating from one position to the other.

I know exactly why the majority of people do not like the kind of relative status which a free competitive society produces. But every time they try to correct this they start on a course where to apply the same principle universally destroys the whole system.

Now I think that perhaps for the next 200 years we will be fluctuating from the one direction to the other. Trying to satisfy our feeling of justice, and leading away from efficiency, finding out that in trying to cure poverty we really increase poverty, then returning to the other system, a more effective system to abolish poverty, but on a more unjust principle. And how long it will have to last before we learn to discipline our feelings I can’t predict.