I question how objective these objective criteria you’re talking about are. Usually when we judge someone’s intelligence, we aren’t actually looking at the results of an IQ test, so that’s subjective. Ditto rationality. And if you were really that concerned about education, you’d stop paying so much attention to Eliezer or people who have a bachelor’s degree at best and pay more attention to mainstream academics who actually have PhDs.
FWIW, the actual heuristics I use to determine who’s worth paying attention to are:
What I know of an individual’s track record of saying reasonable things.
Status of them and their ideas within mainstream academia (but because everyone knows about this heuristic, you have to watch out for people faking it).
Looking for other crackpot warning signs I’ve picked up over time, e.g. a non-expert claiming the mainstream academic view is not just wrong but obviously stupid, or being more interested in complaining that their views are being suppressed than in arguing for those views.
Which may not be great heuristics, but I’ll wager that they’re better than IQ (wager, in this case, being a figure of speech, because I don’t actually know how you’d adjudicate that bet).
It may be helpful, here, to quote what I hope will be henceforth known as the Litany of Hermione: “The thing that people forget sometimes, is that even though appearances can be misleading, they’re usually not.”
You’ve also succeeded in giving me second thoughts about being signed up for cryonics, on the grounds that I failed to consider how it might encourage terrible mental habits in others. For the record, it strikes me as quite possible that mainstream neuroscientists are entirely correct to be dismissive of cryonics—my biggest problem is that I’m fuzzy on what exactly they think about cryonics (more here).
Your heuristics are, in my opinion, too conservative or not strong enough.
Track record of saying reasonable things once again seems to put the burden of decision on your subjective feelings and so rule out paying attention to people you disagree with. If you’re a creationist, you can rule out paying attention to Richard Dawkins, because if he’s wrong about God existing, about the age of the Earth, and about homosexuality being okay, how can you ever expect him to be right about evolution? If you’re anti-transhumanism, you can rule out cryonicists because they tend to say lots of other unreasonable things like that computers will be smarter than humans, or that there can be “intelligence explosions”, or that you can upload a human brain.
Status within mainstream academia is a really good heuristic, and this is part of what I mean when I say I use education as a heuristic. Certainly to a first approximation, before investigating a field, you should just automatically believe everything the mainstream academics believe. But then we expect mainstream academia to be wrong in a lot of cases—you bring up the case of mainstream academic philosophy, and although I’m less certain than you are there, I admit I am very skeptical of them. So when we say we need heuristics to find ideas to pay attention to, I’m assuming we’ve already started by assuming mainstream academia is always right, and we’re looking for which challenges to them we should pay attention to. I agree that “challenges the academics themselves take seriously” is a good first step, but I’m not sure that would suffice to discover the critique of mainstream philosophy. And it’s very little help at all in fields like politics.
The crackpot warning signs are good (although it’s interesting how often basically correct people end up displaying some of them because they get angry at having their ideas rejected and so start acting out, and it also seems like people have a bad habit of being very sensitive to crackpot warning signs the opposing side displays and very obtuse to those their own side displays). But once again, these signs are woefully inadequate. Plantinga doesn’t look a bit like a crackpot.
You point out that “Even though appearances can be misleading, they’re usually not.” I would agree, but suggest you extend this to IQ and rationality. We are so fascinated by the man-bites-dog cases of very intelligent people believing stupid things that it’s hard to remember that stupid things are still much, much likelier to be believed by stupid people.
(possible exceptions in politics, but politics is a weird combination of factual and emotive claims, and even the wrong things smart people believe in politics are in my category of “deserve further investigation and charitable treatment”.)
You are right that I rarely have the results of an IQ test (or Stanovich’s rationality test) in front of me. So when I say I judge people by IQ, I think I mean something like what you mean when you say “a track record of making reasonable statements”, except basing “reasonable statements” upon “statements that follow proper logical form and make good arguments” rather than ones I agree with.
So I think it is likely that we both use a basket of heuristics that include education, academic status, estimation of intelligence, estimation of rationality, past track record, crackpot warning signs, and probably some others.
I’m not sure whether we place different emphases on those, or whether we’re using about the same basket but still managing to come to different conclusions due to one or both of us being biased.
Has anyone noticed that, given the fact that most of the material on this site is essentially about philosophy, “academic philosophy sucks” is a Crackpot Warning Sign, i.e. “don’t listen to the hidebound establishment”?
So I normally defend the “trust the experts” position, and I went to grad school for philosophy, but… I think philosophy may be an area where “trust the experts” mostly doesn’t work, simply because with a few exceptions the experts don’t agree on anything. (Fuller explanation, with caveats, here.)
Also, from the same background, it is striking to me that a lot of the criticisms Less Wrong people make of philosophers are the same as the criticisms philosophers make of one another. I can’t really think of a case where Less Wrong stakes out positions that are almost universally rejected by mainstream philosophers. And not just because philosophers disagree so much, though that’s also true, of course; it seems rather that Less Wrong people greatly exaggerate how different they are and how much they disagree with the philosophical mainstream, to the extent that any such thing exists (again, a respect in which their behavior resembles how philosophers treat one another).
Since there is no consensus among philosophers, respecting philosophy is about respecting the process. The negative claims LW makes about philosophy are indeed similar to the negative claims philosophy makes about itself. LW also makes the positive claim that it has a better, faster method than philosophy, but in fact it just has a truncated version of the same method.
As Hallquist notes elsewhere:

But Alexander misunderstands me when he says I accuse Yudkowsky “of being against publicizing his work for review or criticism.” He’s willing to publish it–but only to enlighten us lesser rationalists. He doesn’t view it as a necessary part of checking whether his views are actually right. That means rejecting the social process of science. That’s a problem.
Or, as I like to put it: if you half-bake your bread, then you get your bread quicker... but it’s half-baked.

If what philosophers specialise in is clarifying questions, they can be trusted to get the question right. A typical failure mode of amateur philosophy is to substitute easier questions for harder ones.
You might be interested in this article and this sequence (in particular, the first post of that sequence). “Academic philosophy sucks” is a Crackpot Warning Sign because of the implied brevity. A measured, in-depth criticism is one thing; a smear is another.

Read them; not generally impressed.
Track record of saying reasonable things once again seems to put the burden of decision on your subjective feelings and so rule out paying attention to people you disagree with.
Counterexample: your own investigation of natural law theology. Another: your investigation of the Alzheimer’s bacterium hypothesis. I’d say your own intellectual history nicely demonstrates just how to pull off the seemingly impossible feat of detecting reasonable people you disagree with.
But then we expect mainstream academia to be wrong in a lot of cases—you bring up the case of mainstream academic philosophy, and although I’m less certain than you are there, I admit I am very skeptical of them.
With philosophy, I think the easiest, most important thing for non-experts to notice is that (with a few arguable exceptions that are independently pretty reasonable) philosophers basically don’t agree on anything. In the case of e.g. Plantinga specifically, non-experts can notice that few other philosophers think the modal ontological argument accomplishes anything.
The crackpot warning signs are good (although it’s interesting how often basically correct people end up displaying some of them because they get angry at having their ideas rejected and so start acting out...
Examples?
We are so fascinated by the man-bites-dog cases of very intelligent people believing stupid things that it’s hard to remember that stupid things are still much, much likelier to be believed by stupid people.
(possible exceptions in politics, but politics is a weird combination of factual and emotive claims, and even the wrong things smart people believe in politics are in my category of “deserve further investigation and charitable treatment”.)
I don’t think “smart people saying stupid things” reaches anything like man-bites-dog levels of surprisingness. Not only do you have examples from politics, but also from religion. According to a recent study, a little over a third of academics claim that “I know God really exists and I have no doubts about it,” which is maybe less than the general public but still a sizeable minority (and the same study found many more academics take some sort of weaker pro-religion stance). And in my experience, even highly respected academics, when they try to defend religion, routinely make juvenile mistakes that make Plantinga look good by comparison. (Remember, I used Plantinga in the OP not because he makes the dumbest mistakes per se but as an example of how bad arguments can signal high intelligence.)
So when I say I judge people by IQ, I think I mean something like what you mean when you say “a track record of making reasonable statements”, except basing “reasonable statements” upon “statements that follow proper logical form and make good arguments” rather than ones I agree with.
Proper logical form comes cheap, just add a premise which says, “if everything I’ve said so far is true, then my conclusion is true.” “Good arguments” is much harder to judge, and seems to defeat the purpose of having a heuristic for deciding who to treat charitably: if I say “this guy’s arguments are terrible,” and you say, “you should read those arguments more charitably,” it doesn’t do much good for you to defend that claim by saying, “well, he has a track record of making good arguments.”
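To see concretely why proper logical form comes cheap, here is a minimal sketch in standard propositional notation (the $P_i$ and $C$ are placeholders, not anything from this discussion):

\begin{align*}
& P_1,\ P_2,\ \dots,\ P_n && \text{(everything asserted so far)} \\
& (P_1 \land P_2 \land \dots \land P_n) \rightarrow C && \text{(the added bridge premise)} \\
& \therefore\ C && \text{(conjunction introduction, then modus ponens)}
\end{align*}

The resulting argument is formally valid no matter what the $P_i$ and $C$ say; all of the doubt has simply been relocated into the bridge premise, so passing a validity check tells you nothing about soundness.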
I agree that disagreement among philosophers is a red flag that we should be looking for alternative positions.
But again, I don’t feel like that’s strong enough. Nutrition scientists disagree. Politicians and political scientists disagree. Psychologists and social scientists disagree. Now that we know we should be looking for high-quality contrarians in those fields, how do we sort out the high-quality ones from the lower-quality ones?
Examples?
Well, take Barry Marshall. Became convinced that ulcers were caused by a stomach bacterium (he was right; later won the Nobel Prize). No one listened to him. He said that “my results were disputed and disbelieved, not on the basis of science but because they simply could not be true...if I was right, then treatment for ulcer disease would be revolutionized. It would be simple, cheap and it would be a cure. It seemed to me that for the sake of patients this research had to be fast tracked. The sense of urgency and frustration with the medical community was partly due to my disposition and age.”
So Marshall decided that, since he couldn’t get anyone to fund a study, he would experiment on himself: he drank a culture of the bacteria and got really sick.
Then due to a weird chain of events, his results ended up being published in the Star, a tabloid newspaper that by his own admission “talked about alien babies being adopted by Nancy Reagan”, before they made it into legitimate medical journals.
I feel like it would be pretty easy to check off a bunch of boxes on any given crackpot index: “believes the establishment is ignoring him because of their biases”, “believes his discovery will instantly solve a centuries-old problem with no side effects”, “does his studies on himself”, “studies get published in a tabloid rather than a journal”. But these were just things he naturally felt or had to do because the establishment wouldn’t take him seriously and he couldn’t do things “right”.
I don’t think “smart people saying stupid things” reaches anything like man-bites-dog levels of surprisingness. Not only do you have examples from politics, but also from religion. According to a recent study, a little over a third of academics claim that “I know God really exists and I have no doubts about it,” which is maybe less than the general public but still a sizeable minority
I think it is much, much less than the general public, but I don’t think that has as much to do with IQ per se as with academic culture. And although I agree it is interesting that IQ isn’t a stronger predictor of correct beliefs than it is, I am still very surprised that you don’t seem to think it matters at all (or at least significantly). What if we switched gears? Agreeing that the fact that a contrarian theory is invented or held by high-IQ people is no guarantee of its success, can we agree that the fact that a contrarian theory is invented and mostly held by low-IQ people is a very strong strike against it?
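One way to make “very strong strike against it” precise is as a Bayesian likelihood ratio; the following is only a sketch of the form of the inference, with no actual numbers assumed:

\[
\frac{P(T \mid E)}{P(\neg T \mid E)} \;=\; \frac{P(E \mid T)}{P(E \mid \neg T)} \cdot \frac{P(T)}{P(\neg T)}
\]

where $T$ is “the contrarian theory is true” and $E$ is “its proponents are mostly low-IQ.” If true contrarian theories rarely end up with mostly low-IQ proponents while false ones often do, then $P(E \mid T)/P(E \mid \neg T) \ll 1$ and the observation counts heavily against the theory, which is exactly the asymmetry proposed here.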
Proper logical form comes cheap, just add a premise which says, “if everything I’ve said so far is true, then my conclusion is true.”
Proper logical form comes cheap, but a surprising number of people don’t bother even with that. Do you frequently see people appending “if everything I’ve said so far is true, then my conclusion is true” to screw with people who judge arguments based on proper logical form?
The extent to which science rejected the ulcer bacterium theory has been exaggerated. (And that article also addresses some quotes from Marshall himself which don’t exactly match up with the facts.)

Nutrition scientists disagree. Politicians and political scientists disagree. Psychologists and social scientists disagree. Now that we know we should be looking for high-quality contrarians in those fields, how do we sort out the high-quality ones from the lower-quality ones?
What’s your proposal for how to do that, aside from just evaluating the arguments the normal way? Ignore the politicians, and we’re basically talking about people who all have PhDs, so education can’t be the heuristic. You also proposed IQ and rationality, but admitted we aren’t going to have good ways to measure them directly, aside from looking for “statements that follow proper logical form and make good arguments.” I pointed out that “good arguments” is circular if we’re trying to decide who to read charitably, and you had no response to that.
That leaves us with “proper logical form,” about which you said:
Proper logical form comes cheap, but a surprising number of people don’t bother even with that. Do you frequently see people appending “if everything I’ve said so far is true, then my conclusion is true” to screw with people who judge arguments based on proper logical form?
In response to this, I’ll just point out that this is not an argument in proper logical form. It’s a lone assertion followed by a rhetorical question.