Just as you need to give criminal suspects trials even if you think they are guilty, you need to treat ideas rationally even if you think their supporters don’t have a case.
If we had an infinite amount of time to spend treating all cases equally, then I would agree that all opinions should be argued out rather than ignored. Unfortunately, we have only limited time, and have to allocate it where we think it will do the greatest good. I think that even a perfect rationalist would encounter ideas that aren’t worth the time to debate fully with their proponents. We aren’t perfect rationalists, but thankfully we know that we aren’t, and can try to compensate for our inadequacies. One such inadequacy is that we generally overestimate, often drastically, how likely we are to be correct. In extreme cases, we even assign 100% certainty to things. The previous two sequences explain why this is a very bad idea. I think this is the sort of thing you were pointing out, and I would agree with you on that.
Even so, suppose I am extremely certain of something, and have good reason to believe that I’m not missing some subtle point (say, because the topic has been debated to death for the past century), and I apply a correction factor to compensate for my known tendency to overestimate probabilities. If, after all this, the probability of my being correct still turns out to be quite high, with a narrow standard deviation, then I would indeed be inclined to spend little to no time on further discussion, and instead devote all my energies to fixing the problem by any means necessary.
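To make that “correction factor” concrete, here is a minimal sketch of one way it could work. The approach (shrinking log-odds toward zero by a constant factor) and the 0.8 value are my illustrative assumptions, not anything established above:

```python
import math

def corrected_probability(p_raw: float, shrink: float = 0.8) -> float:
    """Shrink a raw confidence toward 50% to compensate for overconfidence.

    p_raw:  the probability I naively assign to being correct
    shrink: calibration factor in (0, 1]; smaller means more correction
            (0.8 is an illustrative guess, not a measured value)
    """
    log_odds = math.log(p_raw / (1 - p_raw))        # convert to log-odds
    return 1 / (1 + math.exp(-shrink * log_odds))   # shrink, convert back

# A naive 99.9% becomes a more modest ~99.6% after correction.
print(corrected_probability(0.999))
```

If the corrected number is still very high, and the uncertainty around it is narrow, the argument above goes through.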
Further, I suspect that you do the same to some degree. What issues do you spend more time arguing over than solving? (Maybe most political issues.) What issues do you spend more time solving than arguing over? (Perhaps you and your spouse spend more time actually doing housework than discussing how to optimally divide the labor.) What issues do you spend 99% of your effort fixing, rather than discussing the best fix? Aren’t there some issues where you refuse to “feed the trolls”, thus rejecting an opportunity to debate a topic you are extremely sure of? Or do you make a policy of always replying to all such bait?
I’m not saying it would be a great heuristic to follow, especially for most people. I’m saying that, in an extremely narrow scope, it holds true. Take the limit as p(correct) approaches certainty: infinity if you measure in decibels (∞ dB) or odds (∞:1), 1 if you use fractions (a 1/1 chance), 100 if you use percentages (100%); I think decibels illustrate the point nicely. Eventually you have to start acting quite similarly to how you would act if you were 100% certain. That’s why Cromwell’s Rule exists: to protect us from rashly assigning such ludicrously high probabilities to anything. That overconfidence is the real problem.
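For concreteness, here is a small sketch of the unit conversion I mean, using the standard definition of evidence in decibels as 10·log10 of the odds; the particular probabilities are only illustrative:

```python
import math

def to_decibels(p: float) -> float:
    """Express a probability as decibels of evidence: 10 * log10(odds)."""
    odds = p / (1 - p)
    return 10 * math.log10(odds)

# The same belief in three notations: fraction, odds, decibels.
for p in (0.5, 0.9, 0.99, 0.999999):
    odds = p / (1 - p)
    print(f"p = {p:<9} odds = {odds:>9.0f}:1  {to_decibels(p):>6.1f} dB")

# As p -> 1 the decibel measure diverges to infinity, which is why
# Cromwell's Rule warns against ever assigning probability exactly 1.
```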
I’m saying that, in an extremely narrow scope, it holds true.
Yes, it does. I would, for instance, put creationism in that category.
But I suspect the advice would be bad for most people, at least most people of the kind you see in Internet arguments, because people have a habit of saying that all sorts of things are really well established to all reasonable people and are only opposed by the deluded and by those with a stake in the problem. I wouldn’t, for instance, put capital punishment, or vegetarianism, or effective altruism, or immigration, in that category, but I’ve seen people treat all of those that way.