That sounds likely to produce more effective argumentation rather than more effective reasoning. We’re essentially talking about reviving Rhetoric as a subject of study, either formally as a course or informally by way of lots of practice in the domain—and while that might include some inoculation against biases, it’s not at all clear whether that would dominate the effects of learning to leverage biases more subtly and effectively.
At a guess, in fact, I’d say the reverse is true.
That sounds likely to produce more effective argumentation rather than more effective reasoning.
If you expose populations of gazelles to populations of cheetahs, you will get gazelles who are more effective cheetah-avoiders.
They will also be faster. Actually, really, objectively faster, as measured by someone who has a clock rather than a cheetah’s appetite as their metric.
The strongest techniques of argumentation — the ones that work against people who are also strong arguers — happen to be those that are in conformance with the mathematical rules of logic and evidence. That is why the ancients figured out syllogisms, and the less-ancients figured out probability, rules of evidence, symbolic logic, significance tests, the rule against hearsay, and so on.
(Evidence is not just “what seems convincing”, either. In a world where other people are trying to convince you of false things in order to take advantage of you, it is to your advantage to only be convinced by that which is actually true.)
This should not be surprising. If you want to beat others on a given field, you have to take advantage of the properties of that field — not just take advantage of naïve opponents. You do not become a chess master by studying the psychology of chess players; you study chess.
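One standard way to cash out the "mathematical rules of evidence" invoked above, offered here as a gloss rather than something either commenter spells out: in the Bayesian odds form, an observation E counts as evidence for a hypothesis H only to the extent that E is more likely if H is true than if it is false.

\[
\frac{P(H \mid E)}{P(\lnot H \mid E)} \;=\; \frac{P(E \mid H)}{P(E \mid \lnot H)} \cdot \frac{P(H)}{P(\lnot H)}
\]

An argument that a skilled rhetorician could produce whether or not H were true has a likelihood ratio near 1, and so should move your beliefs almost not at all; that is the formal version of only being convinced by that which is actually true.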
Evidence, at the level of a single argument in any field that isn’t subject to unambiguous experimental tests, is “what seems convincing”. That’s almost tautological. Careers in these fields—which make up the vast majority of talky fields out there, incidentally, and thus account for the vast majority of arguments that a randomly selected member of the public will ever get into—aren’t made by being right in an abstract sense, but by convincing bosses, investors, and/or members of the public that you’re right. Avoiding being manipulated by your opponents is also important, but that has a lot less to do with formal logic and a lot more to do with social dynamics.
Out in the wild, I don’t see a whole lot of passion for the mathematical rules of logic and evidence in the practice of people whose job it is to argue with other strong debaters, e.g. lawyers and politicians. Same goes—in an admittedly more sophisticated way—for many branches of academia, which is theoretically a reference class made up entirely of people who’re well-informed about the rules of logic and evidence, so we’re not just dealing with a need to pander here.
What I do see is a lot of complex signaling behavior, a lot of sophistication around the selection and presentation of evidence that favors your side, and a lot of techniques for seeming, or for actually being, sincere in the presentation of your argument. Which is exactly what I’d expect. We’re not dealing with predator/prey dynamics here, where the criteria for fit and unfit are unambiguous and large chunks of fitness ultimately come down to physics; we’re dealing with a nasty incestuous free-for-all, where fitness is usually socially determined, using brains that’re built not for formal logic but for managing personal alliances. What do you think the cheapest route to winning an argument is going to be, most of the time?
Spending a lot of time arguing is very different from optimizing for being persuasive, or for only being persuaded by true arguments. Curi evidently spends a lot more time in argument than most members of this board, but I certainly wouldn’t say that it’s been helpful for him.
A gazelle that gets caught by a cheetah will die. A person who makes less-sound points in a debate and refuses to change their mind can not only insist that they won the argument; they may even preserve more social status by doing so than by acknowledging that they were wrong.