The stated goal of the community is to refine the art of human rationality. Unfortunately, rationality is an instrumental goal, dependent upon some next-level-up or terminal goal. Most people, including me (initially, at least), assume that the next goal up is logical argumentation, or discovering how to reason better.
Most of the practices here are rational in terms of a specific individual’s goals (mostly in terms of maintaining beliefs) but are strictly contrary to good argumentation technique. The number of ridiculous strawmen, arguments from authority, arguments by pointing to something long and rambling that has nothing to do with the initial argument, etc. is nothing short of overwhelming.
So the next goal up clearly isn’t rational argumentation. Assuming that it was is the mistake I made in the post Irrational Upvotes (and the reason I subsequently retracted my claim that the upvotes were irrational). They are rational in relation to another goal; my error was in my assumption about what that goal is.
One of Eliezer’s main points is learning how to learn where you go wrong. This community is far worse at that than most I’ve seen. Y’all know how to argue/debate “logically” much better—but it’s normally to the purpose of retaining your views, not discovering where you might have gone wrong or where you might do better.
(I’ll cover 1 and 2 in subsequent comments—thanks for a high-quality response)
the number of ridiculous strawmen, arguments from authority, arguments by pointing to something long and rambling that has nothing to do with the initial argument, etc. is nothing short of overwhelming.
Some things to consider on these points (mostly because I have not noticed a prevalence of these issues):
Strawmen. If, from point X, position Y looks like a strawman, then from point Y, X will look like a strawman in turn (I think). If that’s the case, it could be that many of us are at point X (LW rationality techniques, etc.) and you are at point Y—making valid, credible arguments that we are countering with strawmen, as it were.
Arguments from authority. A hallmark of LessWrong is linking back to the Sequences or to other posts; this could very easily look like we are saying “Eliezer said that’s not the case.” We aren’t; he just produced a very good explanation of why it isn’t the case, and it’s easier to link to that explanation than to fumble through a duplication of our own. Another point is that the average LWer is far more willing to defer to people they know to be often correct—the judgement of a good Bayesian reasoner is itself evidence. This looks even more like argument from authority, but there are subtle differences.
Links to long, rambling segues that aren’t related. They are related, mostly. A combination of decompartmentalised thinking, a facility for drawing analogies, and comfort with (very) long inferential distances can produce relationships that seem bizarre or unlikely.
Lastly, this comment:
it’s normally to the purpose of retaining your views, not discovering where you might have gone wrong or where you might do better.
is definitely a concern for ALL LWers. I suspect you have stumbled onto a case analogous to theism: it is not that we wish to retain our atheism and therefore argue to keep that view—we really, truly have considered all the evidence and all the arguments, and we reject theism on those grounds.
Has it got to the point where replying to this would be a violation of the ‘Do not feed the trolls’ convention? I had written a brief response but realized it might be better to ignore this instead. But I will defer to the judgement of others here… if there are people still taking mwaser seriously, then I will engage as appropriate.
Has it got to the point where replying to this would be a violation of the ‘Do not feed the trolls’ convention?
mwaser does not sound trollish here to me:
This community is far worse at that than most I’ve seen. Y’all know how to argue/debate “logically” much better—but it’s normally to the purpose of retaining your views, not discovering where you might have gone wrong or where you might do better.
There are users whom I think this describes well, including a few very active and usually-correct ones.
Good point. An important distinction. Trolling is entirely different in nature (and much more normatively objectionable). Although one way to create trolls is to feed the crackpots after midnight.
Yes, reducing the total number of replies is a feature. Reducing the number of good replies (and/or the diversity of replies) is a bug. Making it easy to make a mistake is a major bug. Too many people don’t bother to understand a post before they upvote or downvote it; they go with their initial prejudices. To form a coherent reply requires reading and understanding a post—assuming, of course, it is a post with substance (which is why I “complain” about substanceless posts).
Look at my most recent post here. It’s down to −5 and has exactly one pretty useless comment. I have gotten some really good criticism in the past, but I’ve also had posts where the only comments are endless repetitions of “he is obviously making this assumption”—regardless of how many times I say, “No, I don’t believe that. I am deriving my point from this other direction.” (Though I must also admit that some of my original replies were not that clear, courteous, or cool-headed ;-)
It seems as though you don’t like karma systems. But surely they do much more good than bad. Poor karma has precious few consequences around here. Maybe there should be more—like throttling comments.
Not exactly, but I would support a “Do not feed the crackpots” convention.
Downvoted, because a comment suggesting “throttling comments” is exactly the kind of comment I would like to throttle. :P