Given a bunch of people who disagree, some of whom are actual experts and some of whom are selling snake oil, and lacking the expertise yourself, there are some further quick-and-dirty heuristics you can use to tell which of the two groups is which. I think my suggestion can best be summarized as “look at argument structure”.
The real experts will likely spend a bunch of time correcting popular misconceptions, which the fakers may themselves subscribe to. By contrast, the fakers will generally not bother “correcting” the truth to their fakery, because why would they? They’re trying to sell to unreflective people who just believe the obvious-seeming thing; someone who actually bothered to read corrections to misconceptions at any point is likely too savvy to be their target audience.
Sometimes, though, you do get actual arguments. Fortunately, it’s easier to evaluate arguments than to determine the truth oneself—of course, this is only any good if at least one of the parties is right! If everyone is wrong, heuristics like this will likely be no help. But in an experts-and-fakers situation, where one of the groups is right and the other pretty definitely wrong, you can often just use heuristics like “which side has arguments (that make some degree of sense) that the other side has no answer to (that makes any sense)?”. If we grant the assumption that one of the two sides is right, then it’s likely to be that one.
When you actually have a lot of back-and-forth arguing—as you might get in politics, or in disputes between actual experts—the usefulness of this sort of thing can drop quickly, but if you’re just trying to sort the fakers from those with actual knowledge, I think it can work pretty well. (Although honestly, in a dispute between experts, I think leaving a key argument unanswered is still a pretty big red flag.)
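To make the “which side has arguments the other side has no answer to” idea concrete, here is a minimal Python sketch. Everything in it (the `Argument` class, the `favored_side` scoring rule, the toy dispute) is my own illustrative invention, not anything proposed in this thread; it just treats the dispute as a set of rebuttal links and favors the side with more unanswered points.

```python
# Illustrative sketch only: a toy model of the "unanswered argument"
# heuristic. The Argument class and favored_side() scoring rule are
# invented for this example, not taken from the discussion above.
from dataclasses import dataclass, field

@dataclass
class Argument:
    side: str      # which camp advanced this argument
    claim: str
    rebuttals: list = field(default_factory=list)  # Arguments answering this one

def unanswered(args):
    """Arguments that received no rebuttal at all."""
    return [a for a in args if not a.rebuttals]

def favored_side(args):
    """Crude scoring: favor the side with more unanswered arguments."""
    counts = {}
    for a in unanswered(args):
        counts[a.side] = counts.get(a.side, 0) + 1
    return max(counts, key=counts.get) if counts else None

# Toy dispute: the expert rebuts the faker's point, but the faker
# never answers the expert's point, so the heuristic favors the expert.
expert_pt = Argument("expert", "the obvious-seeming fix fails in practice")
faker_pt = Argument("faker", "just do the obvious thing", rebuttals=[expert_pt])
print(favored_side([expert_pt, faker_pt]))  # -> expert
```

Of course a real dispute isn’t settled by counting; the point is only that “what rebuts what” is a structure you can inspect without being able to judge the first-order truth yourself.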
Using this heuristic as stated would often backfire on you, because there’s a certain class of snake oil salesmen who use the conceit of correcting popular misconceptions to sell you on their own, unpopular misconceptions (and of course the product that fits them!). To me it looks like it’s exploiting the same kind of psychological mechanism that powers conspiracy theories, where the world is seen as full of hidden knowledge that “they” don’t want you to know because the misinformation is letting “them” get rich or whatever. And I think part of the reason this works is that it pattern-matches to cases where someone who thought everyone else was wrong really turned out to be right, even if such cases are rare.
In short, you are more likely to be encountering a snake oil salesman than a Galileo, a Copernicus, or a Darwin, so spending a lot of time “correcting” popular misconceptions is not, by itself, a reliable signal of real competence rather than fakery.
This is a good point (the redemption movement comes to mind as an example), but I think the cases I’m thinking of and the cases you’re describing look quite different in other details. Like, the bored/annoyed expert tired of having to correct basic mistakes, vs. the salesman who wants to initiate you into a new, exciting secret. But yeah, this is only a quick-and-dirty heuristic, and even then only good for distinguishing snake oil; it might not be a good idea to put too much weight on it, and it definitely won’t help you in a real dispute (“Wait, both sides are annoyed that the other is getting basic points wrong!”). As Eliezer put it—you can’t learn physics by studying psychology!
This seems to rely on the fakes knowing they are fakes. I agree that is a problem and that your heuristic is useful, but I think we (non-experts) are still stuck with the problem of separating the real experts from those who mistakenly think they are real experts. The latter will likely attempt to “correct” the true security approach according to their mistaken premises and solutions. We’re still stuck with the problem that money doesn’t get the non-expert client very far.
Now, you’ve clearly been able to improve the ratio of real solutions to snake oil, and so moved the probabilities in your favor when throwing money at the problem, but I’m not sure just how far.
It seems like “real expert” is being used here in two different senses. In one sense, an expert is someone who has spent their 10,000 hours of deliberate practice and developed strong opinions, which they can articulate, about the right way to do things. That person will likely have convictions about what the public misconceptions happen to be.
In the other sense, being an expert is about the ability to produce outputs of a certain quality.
You can tell whether a person is an expert in the first sense by seeing whether they try to correct your misconceptions and have convictions about the right way to act, or whether they just tell you what’s popular to say and what you want to hear.
I assume that is directed at my comment, but I’m not certain. The point I am making is that even after eliminating those who “just tell you what’s popular to say and what you want to hear”, you still have the problem of separating the remaining experts who understand the subtleties and details as they apply to your specific needs from those who don’t.
The heuristics about how they present their sales pitch are “leaky filters”, as the OP notes, and I’m not entirely sure we understand how far they actually move the probabilities of getting the real expert rather than the mediocre one (who knows all the theory and terms, and even has a good idea of how they all relate, but just does not actually get the system as a whole, or perhaps is just too lazy to do the work).
For those pushing these specific heuristics, is there any actual data we can look at to see how effective they are?
Here’s another: probing into their argument structure a bit and checking if they can keep it from collapsing under its own weight.
https://www.lesswrong.com/posts/wyyfFfaRar2jEdeQK/entangled-truths-contagious-lies
And how does one distinguish snake oil salesmen from real experts when it comes to identifying argument structure and what it implies?
What I said above. Sorry, to be clear here, by “argument structure” I don’t mean the structure of the individual arguments but rather the overall argument—what rebuts what.
(Edit: Looks like I misread the parent comment and this fails to respond to it; see below.)
To be clear as well, the rhetorical point underneath my question is that I don’t think your heuristic is all that useful, and seems grounded in generalization from too few examples without searching for counterexamples. Rather than just attacking it directly like Gordon, I was trying to go up a meta-level, to just point at the difficulty of ‘buying’ methods of determining expertise, because you need to have expertise in distinguishing the market there.
(In general, when someone identifies a problem and you think you have a solution, it’s useful to consider whether your solution suffers from that problem on a different meta-level; sometimes you gain from sweeping the difficulty there, and sometimes you don’t.)
Oh, I see. I misread your comment then. Yes, I am assuming one already has the ability to discern the structure of an argument and doesn’t need to hire someone else to do that for them...
Probably the skill of discerning skill would be easier to learn than… every single skill you’re trying to discern.