The problem you have is the one shared by everyone from devotees of parapsychology to people who believe Meredith Kercher was killed in an orgy initiated by Amanda Knox: your prior on your theory is simply way too high.
Simply put, the events of 9/11 are so overwhelmingly more likely a priori to have been the exclusive work of a few terrorists than the product of a conspiracy involving the U.S. government, that the puzzling details you cite, even in their totality, fail to make a dent in a rational observer’s credence in (more or less) the official story.
You might try asking yourself: if the official story were in fact correct, wouldn’t you nevertheless expect that there would be strange facts that appear difficult to explain, and that these facts would be seized upon by conspiracy theorists, who, for some reason or another, were eager to believe the government may have been involved? And that they would be able to come up with arguments that sound convincing?
I want to stress that it is not the fact that the terrorists-only theory is officially sanctioned that makes it the (overwhelming) default explanation; as the Kercher case illustrates, sometimes the official story is an implausible conspiracy theory! Rather, it is our background knowledge of how reality operates—which must be informed, among other things, by an acquaintance with human cognitive biases.
“Not silencing skeptical inquiry” is a great-sounding applause light, but we have to choose our battles, for reasons more mathematical than social: there are simply too many conceivable explanations for any given phenomenon for it to be worthwhile to consider more than a very small proportion of them. Our choice of which to consider in the first place is thus going to be mainly determined by our prior probabilities—in other words, our model of the world. Under the models of most folks here, 9/11 conspiracy theories simply aren’t going to get any time of day.
If it’s different for you, I’d be curious to know what kind of ideas with substantial numbers of adherents you would feel safe in dismissing without bothering to research. (If there aren’t any, then I think you severely overestimate the tendency of people’s beliefs to be entangled with reality.)
“Not silencing skeptical inquiry” is a great-sounding applause light
The main issue with it has been noted multiple times by people like Dawkins: there is an effort asymmetry between plucking a false but slightly believable theory out of thin air, and actually refuting that same theory. Making shit up takes very little effort, while rationally refuting random made-up shit takes the same effort as rationally refuting theories whose refutation yields actual intellectual value. Creationists can open a hundred false arguments at very little intellectual cost, and if they are dismissed out of hand by the scientific establishment they get to cry “suppression of skeptical inquiry”.
This feels related to pjeby’s recent comments about curiosity. The mere feeling that “there’s something odd going on here”, followed by the insistence that other people should inquire into the odd phenomenon, isn’t valid curiosity. That’s only ersatz curiosity. Real curiosity is what ends up with you actually constructing a refutable hypothesis, and subjecting it to at least the kind of test that a random person from the Internet would perform—before actually publishing your hypothesis, and insisting that others should consider it carefully.
Inflicting random damage on other people’s belief networks isn’t promoting “skeptical inquiry”, it’s the intellectual analogue of terrorism.
I like this comment lots, but I think this comparison is inadvisable hyperbole.

Perhaps “asymmetric warfare” would be a better term than “terrorism”. More general, and without the connotations which I agree make that last line something of an exaggeration.
Again, you’re addressing a straw man—not my actual arguments. I do not claim that the government was responsible for 9/11; I believe the evidence, if properly examined, would probably show this—but my interest is in showing that the existing explanations are not just inadequate but clearly wrong.
So, okay, how would you tell the difference between an argument that “sounds convincing” and one which should actually be considered rationally persuasive?
My use of the “applause light” was an attempt to use emotion to get through emotional barriers preventing rational examination. Was it inappropriate?
“There are simply too many conceivable explanations for any given phenomenon for it to be worthwhile to consider more than a very small proportion of them.”
I agree. Many of the conclusions reached by the 9/11 Commission are, however, not among that small proportion. Many questions to which we need answers were not even addressed by the Commission. (Your statement here strikes me as a “curiosity stopper”.)
Under the models of most folks here, 9/11 conspiracy theories simply aren’t going to get any time of day.
This is the problem, yes. What’s your point?
I’d be curious to know what kind of ideas with substantial numbers of adherents you would feel safe in dismissing without bothering to research.
None that I can think of. Again, what’s your point? I am not “dismissing” the dominant conclusion, I am questioning it. I have, in fact, done substantial amounts of research (probably more than anyone reading this). If anyone is actually dismissing an idea with substantial numbers of adherents, it is those who dismiss “truthers” without actually listening to their arguments.
Are you arguing that “people are irrational, so you might as well give up”?

This is a flat-out Bayesian contradiction.
So, okay, how would you tell the difference between an argument that “sounds convincing” and one which should actually be considered rationally persuasive?
It’s not an easy problem, in general—hence LW!
But we can always start by doing the Bayesian calculation. What’s your prior for the hypothesis that the U.S. government was complicit in the 9/11 attacks? What’s your estimate of the strength of each of those pieces of evidence you think is indicative of a conspiracy?
I’d be curious to know what kind of ideas with substantial numbers of adherents you would feel safe in dismissing without bothering to research.
None that I can think of. Again, what’s your point? I am not “dismissing” the dominant conclusion, I am questioning it.
You misunderstood. I was talking about your failure to dismiss 9/11 conspiracy theories. I was asking whether there were any conspiracy theories that you would be willing to dismiss without research.
Again, I think this question is a diversion from what I have been arguing; its truth or falseness does not substantially affect the truth or falseness of my actual claims (as opposed to beliefs mentioned in passing).
That said, I made a start at a Bayesian analysis, but ran out of mental swap-space. If someone wants to suggest what I need to do next, I might be able to do it.
Also vaguely relevant—this matrix is set up much more like a classical Bayesian word-problem: it lists the various pieces of evidence which we would expect to observe for each known manner in which a high-rise steel-frame building might run down the curtain and join the choir invisible, and then shows what was actually observed in the cases of WTC1, 2, and 7.
Is there enough information there to calculate some odds, or are there still bits missing?
You misunderstood. I was talking about your failure to dismiss 9/11 conspiracy theories. I was asking whether there were any conspiracy theories that you would be willing to dismiss without research.
No, not really. I think of that as my “job” at Issuepedia: don’t dismiss anything without looking at it. Document the process of examination so that others don’t have to repeat it, and so that those who aren’t sure what to believe can quickly see the evidence for themselves (rather than having to go collect it) -- and can enter in any new arguments or questions they might have.
Does that process seem inherently flawed somehow? I’m not sure what you’re suggesting by your use of the word “failure” here.
(Some folks have expressed disapproval of this conversation continuing in this thread; ironically, though, it’s becoming more and more an explicit lesson in Bayesianism—as this comment in particular will demonstrate. Nevertheless, after this comment, I am willing to move it elsewhere, if people insist.)
Again, I think this question is a diversion from what I have been arguing; its truth or falseness does not substantially affect the truth or falseness of my actual claims (as opposed to beliefs mentioned in passing)
You’re in Bayes-land here, not a debating society. Beliefs are what we’re interested in. There’s no distinction between an argument that a certain point of view should be taken seriously and an argument that the point of view in question has a significant probability of being true. If you want to make a case for the former, you’ll necessarily have to make a case for the latter.
That said, I made a start at a Bayesian analysis, but ran out of mental swap-space. If someone wants to suggest what I need to do next, I might be able to do it.
Here’s how you do a Bayesian analysis: you start with a prior probability P(H). Then you consider how much more likely the evidence is to occur if your hypothesis is true (P(E|H)) than it is in general (P(E)) -- that is, you calculate P(E|H)/P(E). Multiplying this “strength of evidence” ratio P(E|H)/P(E) by the prior probability P(H) gives you your posterior (updated) probability P(H|E).
Alternatively, you could think in terms of odds: starting with the prior odds P(H)/P(~H), and considering how much more likely the evidence is to occur if your hypothesis is true (P(E|H)) than if it is false (P(E|~H)); the ratio P(E|H)/P(E|~H) is called the “likelihood ratio” of the evidence. Multiplying the prior odds by the likelihood ratio gives you the posterior odds P(H|E)/P(~H|E).
One of the two questions you need to answer is: by what factor do you think the evidence raises the probability/odds of your hypothesis being true? Are we talking twice as likely? Ten times? A hundred times?
If you know that, plus your current estimate of how likely your hypothesis is, division will tell you what your prior was—which is the other question you need to answer.
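To make that arithmetic concrete, here is a minimal sketch in Python of the odds-form update just described; the prior odds and likelihood ratio in it are made-up placeholders, not estimates for any real hypothesis.

```python
# Minimal sketch of the odds-form Bayesian update described above.
# The numbers are hypothetical placeholders, not estimates for any actual case.

def update_odds(prior_odds, likelihood_ratio):
    # posterior odds = prior odds * P(E|H) / P(E|~H)
    return prior_odds * likelihood_ratio

def odds_to_probability(odds):
    return odds / (1.0 + odds)

prior_odds = 1.0 / 1000.0   # hypothetical prior: 1000-to-1 against H
likelihood_ratio = 10.0     # hypothetical: E is 10x likelier under H than under ~H

posterior_odds = update_odds(prior_odds, likelihood_ratio)
print(odds_to_probability(posterior_odds))  # ~0.0099: still low despite "strong" evidence
```

Note how a tenfold likelihood ratio barely moves a sufficiently small prior, which is the point about priors being made upthread.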
Is there enough information there to calculate some odds, or are there still bits missing?
If there’s enough information for you to have a belief, then there’s enough information to calculate the odds. Because, if you’re a Bayesian, that’s what these numbers represent in the first place: your degree of belief.
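For what it’s worth, one hedged way to turn a matrix like the one linked above into odds is to assign each piece of evidence its own likelihood ratio and multiply them together with the prior odds, under the strong assumption that the pieces of evidence are independent. The sketch below uses placeholder values, not readings from that matrix.

```python
import math

# Sketch: combining several pieces of evidence in log-odds form, under the
# strong assumption that they are independent given H and given ~H.
# All numbers are hypothetical placeholders, not values taken from the matrix.

prior_odds = 1e-5                    # hypothetical prior odds for H
likelihood_ratios = [3.0, 0.5, 2.0]  # hypothetical P(E_i|H) / P(E_i|~H), one per piece of evidence

log_posterior_odds = math.log(prior_odds) + sum(math.log(lr) for lr in likelihood_ratios)
posterior_odds = math.exp(log_posterior_odds)

print(posterior_odds)                         # posterior odds for H
print(posterior_odds / (1 + posterior_odds))  # posterior probability of H
```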
I’m not sure what you’re suggesting by your use of the word “failure” here
“Your failure to dismiss...” is simply an English-language locution that means “The fact that you did not dismiss...”

This thread doesn’t belong under the “What is Bayesianism” post. I advise taking it to the older post that discussed “Truthers”.
Simply put, the events of 9/11 are so overwhelmingly more likely a priori to have been the exclusive work of a few terrorists than the product of a conspiracy involving the U.S. government
What facts lead you to think so?
it is our background knowledge of how reality operates—which must be informed, among other things, by an acquaintance with human cognitive biases.
Where did you get your background knowledge about terrorism and geopolitics?

The way you argue is the way the average person thinks, because the average person has never been able to look behind the scenes of what happens in politics and instead gets his news from the media.