Tyler: If there really is “bad epistemology”, feel free to show where.
Nesov: Also, FOOM rhymes with DOOM. There!
And this response was upvoted … why? This is supposed to be a site where rational discourse is promoted, not a place like Pharyngula or talk.origins where folks who disagree with the local collective worldview get mocked by insiders who then congratulate each other on their cleverness.
I voted it up. It was short, neat, and made several points.
Probably the main claim is that the relationship between the SIAI and previous END OF THE WORLD outfits is a meaningless surface resemblance.
My take on the issue is that DOOM is—in part—a contagious mind-virus, with ancient roots—which certain “vulnerable” people are inclined to spread around—regardless of whether it makes much sense or not.
With the rise of modern DOOM “outfits”, we need to understand the sociological and memetic aspects of these things all the more:
Will we see more cases of “DOOM exploitation”—from those out to convert fear of the imminent end into power, wealth, fame or sex?
Will a paranoid society take steps to avoid the risks? Will it freeze like a rabbit in the headlights? Or will it result in more looting and rape cases?
What is the typical life trajectory of those who get involved with these outfits? Do they go on to become productive members of society? Or do they wind up having nightmares about THE END OF THE WORLD—while neglecting their interpersonal relationships and personal hygiene—unless their friends and family stage an “intervention”?
...and so on.
Rational agents should understand the extent to which they are infected by contagious mind viruses—ones that spread for their own benefit and without concern for the welfare of their hosts. DOOM definitely has the form of such a virus. The issue as I see it is: how much of the observed phenomenon of modern-day DOOM “outfits” does it explain?
To study this whole issue, previous doomsday cults seem like obvious and highly relevant data points to me. In some cases their DOOM was evidently a complete fabrication. They provide pure examples of fake DOOM—exactly the type of material a sociologist would need to understand that aspect of the DOOM-mongering phenomenon.
I agree that it’s annoying when people are mocked for saying something they didn’t say. But Nesov was actually making an implicit argument here, not just having fun: he was pointing out that timtyler’s analogies tend to be surface-level and insubstantial. The kinds of things I’ve seen on Pharyngula are instead unjustified ad hominem attacks that don’t shed any light on possible flaws in the poster’s arguments. That said, I think Nesov’s comment was flirting with the line.
In the case of Tim in particular, I’m way past that.
“Way past that” meaning “so exasperated with Tim that rational discourse seems just not worth it”? Hey, I can sympathize. Been there, done that.
But still, it annoys me when people are attacked by mocking something that they didn’t say, but that their caricature should have said (in a more amusing branch of reality).
It annoys me more when that behavior is applauded.
And it strikes me as deeply ironic when it happens here.
That’s very neatly put.
I’m not dead certain it’s a fair description of what Vladimir Nesov said, but it describes a lot of behavior I’ve seen. And there’s a parallel version about the branches of reality that allow for easier superiority and/or more outrage.
The error Tim makes time and again is finding shallow analogies between the activity of people concerned with existential risk and that of doomsday cults, loudly announcing them, and lamenting that it’s not proper that this important information is so rarely considered. Yet the analogies are obvious and obviously irrelevant. My caricature simply followed the pattern.
The analogies are obvious. They may be irrelevant. They are not obviously irrelevant.
Too fine a distinction to argue, wouldn’t you agree?
Talking about obviousness as if it were inherent in a conclusion is a typical mind projection fallacy. What it generally implies (and what I think you mean) is that any sufficiently rational person would see it; but when lots of people don’t see it, calling it obvious goes against social convention (it’s claiming higher rationality, and thus social status, than your audience). In this case I think that to your average reader the analogies aren’t obviously irrelevant, even though I personally do find them obviously irrelevant.
When you’re trying to argue that something is the case (i.e. that the analogies are irrelevant), the difference between what you are arguing being OBVIOUS and it merely being POSSIBLE is vast.
You seem to confuse the level of certainty with difficulty of discerning it.
You made a claim that they were obviously irrelevant.
The respondent expressed uncertainty as to their irrelevance (“They may be irrelevant.”) as opposed to the certainty in “The analogies are obvious.” and “They are not obviously irrelevant.”
That is a distinction between something being claimed as obvious and the same thing being seen as doubtful.
If you do not wish to explain a point there are many better options* than inaccurately calling it obvious. For example, linking to a previous explanation.
*In rationality terms. In argumentation terms, these techniques are often inferior to the technique of the emperor’s tailors.
Uh, they are not “obviously irrelevant”. The SIAI behaves a bit like other DOOM-mongering organisations have done—and a bit like other FUD marketing organisations have done.
Understanding the level of vulnerability of the human psyche to the DOOM virus is a pretty critical part of assessing what level of paranoia about the topic is reasonable.
It is, in fact, very easy to imagine how a bunch of intrepid “friendly folk” who think they are out to save the world might—in the service of their cause—exaggerate the risks, in the hope of getting attention, help and funds.
Indeed, such an organisation is most likely to be founded by those who have extreme views about the risks, attract others who share similar extreme views, and then have a hard time convincing the rest of the world that they are, in fact, correct.
There are sociological and memetic explanations for the “THE END IS NIGH” phenomenon that are more-or-less independent of the actual value of p(DOOM). I think these should be studied more, and applied to this case—so that we can better see what is left over.
There has been some existing study of DOOM-mongering. There is also the associated Messiah complex—an intense desire to save others. With the rise of the modern doomsday “outfits”, I think more study of these phenomena is warranted.
Sometimes it is fear that is the mind-killer. FUD marketing exploits this to help part marks from their money. THE END OF THE WORLD is big and scary—a fear superstimulus—and there is a long tradition of using it to move power around and achieve personal ends—and the phenomenon spreads virally.
I appreciate that this will probably turn the stomachs of the faithful—but without even exploring the issue, you can’t competently defend the community against such an analysis, because you don’t know to what extent it is true—you haven’t even looked into it.