I think the way you use “conspiracy theory” is quite a good one*, but somewhat non-standard. In particular, you state that ideas described as “conspiracy theories” are sometimes correct. I think Brillyant (to whom I was originally replying) gives a much more standard description when he calls them “ludicrous” and “absurd.” For example, wiki states that the phrase:
has acquired a derogatory meaning, implying a paranoid tendency to see the influence of some malign covert agency in events. The term is often used to dismiss claims that the critic deems ridiculous, misconceived, paranoid, unfounded, outlandish, or irrational.
If the ordinary connotation of “conspiracy theory” was “low-probability hypothesis involving a conspiracy” I would not have objected to its use.
*although I think that the “evil” part needs work.
It looks to me as if Brillyant is using the term to mean something close to “ludicrous, absurd theory involving a conspiracy”. I remark firstly that this isn’t so far from “low-probability hypothesis involving a conspiracy” and secondly that it’s entirely possible that Brillyant hasn’t sat down and thought through exactly what shades of meaning s/he attaches to the term “conspiracy theory”, and that on further reflection s/he would define that term in a way that clearly doesn’t amount to “theory I want to make fun of”.
I appreciate that you’re concerned about equivocation, where someone effectively argues “this is a conspiracy theory (= theory with a conspiracy in), therefore it should be treated like a conspiracy theory (= ludicrous, absurd theory with a conspiracy in)”. But I don’t see anyone doing that in this thread, and given how firmly established the term is, I don’t think there’s much mileage in trying to prevent it by declaring that “conspiracy theory” simply means “theory with a conspiracy in”.
(In particular I don’t think Brillyant is engaging in any such equivocation. Rather, I think s/he is, or would be after more explicit reflection, saying something like this: 1. People like to believe in conspiracies. 2. Therefore, the fact that a theory is believed by quite a lot of people is less evidence when the theory features a conspiracy than it normally would be. 3. So when someone offers up a theory that isn’t terribly plausible on its face and that involves a conspiracy, my initial estimate is that it’s unlikely to be true; the best explanation of the fact that I’m being invited to consider it is that its advocates have fallen prey to their inbuilt liking for theories involving conspiracies. -- This doesn’t oblige Brillyant to disbelieve every theory with a conspiracy in, because some actually have good evidence or are highly plausible for other reasons. Those tend not to be the ones labelled “conspiracy theory”.)
I think you misunderstand my concern; perhaps I have not been clear enough. I am not so much worried about equivocation as about precisely the three-step process you describe. And I am particularly worried about people going through that process, labelling something a “conspiracy theory,” and then, when the theory turns out to be true, never reassessing their premises.
Let’s restate your process in more neutral language.
1. For reasons of specialisation, partial information, etc., I treat the fact that lots of people believe in a theory as partial evidence in its favour.
2. Some people have a higher prior than me for the existence of conspiracies.
3. Therefore if a theory involving a conspiracy is believed by quite a lot of people, it may be that this belief is due to their higher prior for conspiracies, not any special knowledge or expertise that I need to defer to.
4. Therefore I treat their belief in the theory as less evidence than normal, on the basis that if I had the evidence/expertise/etc. that they do, I would be less likely than them to conclude that there is a conspiracy.
5. So if someone offers an implausible-seeming theory to me involving a conspiracy, I discount it and conclude that its advocates just have a high prior for conspiracies.
Suddenly, this doesn’t look like a sound epistemological process at all. Steps (1) and (2) are fine, but (3), (4) and (5) go increasingly off the rails. It looks like you are deliberately shielding your anti-conspiracies prior, by discounting (even beyond their initial level of plausibility) theories that might challenge it. And if, on those occasions that a conspiracy is eventually proven, you refuse to update your prior on the likelihood of conspiracies (by insisting that such-and-such a theory doesn’t really count as a conspiracy theory, even though, at the time, you were happy to label it as such), then I would say that the process has become truly pathological, just as much as that of a “conspiracy theorist.”
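The discounting in those later steps, and the update that a proven conspiracy ought to force, can be put in toy Bayesian terms. Every probability below is an illustrative assumption of mine, not anything estimated in the thread:

```python
# Toy Bayesian sketch of the five-step process above. Every number is an
# illustrative assumption; nothing here comes from the thread itself.

def posterior(prior_t, p_belief_if_true, p_belief_if_false):
    """P(theory true | theory widely believed), by Bayes' rule."""
    num = prior_t * p_belief_if_true
    return num / (num + (1 - prior_t) * p_belief_if_false)

# Step (1): for an ordinary theory, widespread belief is unlikely unless
# the theory is true, so the testimony is substantial evidence.
ordinary = posterior(0.10, p_belief_if_true=0.8, p_belief_if_false=0.1)

# Steps (2)-(4): if people have a high prior for conspiracies, widespread
# belief is fairly likely even when the theory is false, so the very same
# testimony moves the posterior much less.
conspiracy = posterior(0.10, p_belief_if_true=0.8, p_belief_if_false=0.5)

print(f"ordinary theory:   {ordinary:.3f}")    # ~0.47
print(f"conspiracy theory: {conspiracy:.3f}")  # ~0.15

# The pathology: when a theory once labelled a "conspiracy theory" is later
# proven true, p_belief_if_false (the size of the discount) should itself be
# revised downward. Re-labelling the proven case as "not really a conspiracy
# theory" freezes that parameter and shields the prior from ever updating.
```

The asymmetry the model makes visible: the discount is only legitimate if its size is itself open to revision by outcomes.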
Consider: why do some people have that higher prior? Mightn’t that higher prior be itself part of their tacit knowledge and expertise, in the same way that a doctor’s prior on the cause of a set of symptoms reflects not a ‘liking’ for diagnosing people with tuberculosis but his own updates on past experience to which you are not privy? Aren’t we doing precisely the wrong thing by discounting the theory in response to the prior?
None of this means that you should become a 9-11 Truther, of course. But consider the 1999 Moscow bombings. I don’t have any particular evidence about the events, but there’s a plausible case that they were an FSB conspiracy. Shouldn’t that make you more willing to believe the hypothetical in the grandparent than otherwise? In my own experience, people who are most likely to believe in conspiracy theories are those who had their formative experiences in dictatorial countries where there really are lots of conspiracies—and so they subsequently see them everywhere. But by symmetry, it follows that those of us brought up in the West will be too reluctant to see conspiracies elsewhere.
(I’m not sure that #2 is the right formulation. A lot of people don’t think in terms sufficiently close to Bayesian inference that talking about their “priors” really makes sense. I’m not sure this is more than nit-picking, though.)
I agree that #3,4,5 “go increasingly off the rails” but I think what goes off the rails is your description, as much as the actual mental process it aims to describe. Specifically, I think you are making the following claims and blaming them on the term “conspiracy theory”:
1. That when someone thinks something is a “conspiracy theory” they discount it not only in the sense of thinking it less likely than they otherwise would have, but in the stronger sense of dismissing it completely.
2. That they are then immune to further evidence that might (if they were rational) lead them to accept the theory after all.
3. That if the theory eventually turns out to have been right, they don’t update their estimate for how much to discount theories on account of being suspiciously conspiracy-based.
Now, I dare say many people do do just those things. After all, many people do all kinds of highly irrational things. But unless I’m badly misreading you, you are claiming specifically that Brillyant and I do them, and you are laying much of the blame for this on the usage of the term “conspiracy theory”, and I think both parts of this are wrong.
Mightn’t that higher prior be itself part of their tacit knowledge and expertise
Yup. But the answer to that question is always yes, and therefore tells us nothing. (Mightn’t a creationist’s higher prior on the universe being only 6000 years old be part of their tacit knowledge and expertise? It might be, but I wouldn’t bet on it.)
But by symmetry, it follows that those of us brought up in the West will be too reluctant to see conspiracies elsewhere.
I don’t think the symmetry is quite there. People brought up in totalitarian countries who then move to liberal democracies see too many conspiracies. No doubt people brought up in liberal democracies who then move to totalitarian countries see too few, but it could still be that people brought up in totalitarian countries who stay there and people brought up in liberal democracies who stay there both see approximately the right number of conspiracies.
Thank you for an interesting reply.