Oh, you mean I should make it clear that Scientology is dangerous and people shouldn’t take Scientology classes? I figured that would be obvious, but okay: I added it to the post.
I think your disclaimer looks too much like an implicit challenge: “I dabbled with Scientology classes but didn’t get hooked because I’m that rational/self-disciplined/awesome; but you shouldn’t try it because you’re probably not as awesome, and you might get reeled in.”
The real history of the disclaimer, though, is more like, “I dabbled and didn’t get hooked because I’m awesome, and I didn’t warn you about it at first because I think you’re awesome, but David Gerard thinks otherwise and he twisted my arm.”
For my part, I appreciated having my awesomeness recognized, however briefly. It’s not every day that other people notice that about me. :)
I am in fact just a big meanie about this stuff. “Dad just won’t let me get into the really good mind controlling, he’s so oppressive. Where are my Sea Org teenage minions? This is sooo bogus.”
I am in fact just a big meanie about this stuff. “Dad just won’t let me get into the really good mind controlling, he’s so oppressive. Where are my Sea Org teenage minions? This is sooo bogus.”
If in 25 years any of your kids run an international cult I’m blaming you.
I’m not convinced “p.s.: don’t do this thing that worked out really well for me and I shall now describe in thrilled detail” entirely makes it no longer functionally a personal recommendation, but it’s possibly better than nothing. Thank you.
Yes but LessWrong is a lot like this—witness all the discussions in thrilled detail of drugs that put your brain into a more effective/enjoyable state. It’s assumed that the readership is intelligent/responsible enough to handle this sort of thing.
The desire to succeed in unorthodox ways (“cheat” at life) is strong in many members of this community—Luke’s Scientology story fits that pattern very well. It certainly makes me want to try a com course and I’ve read about Scientology in endless detail—including some of your work.
Sewer-diving could be fun, and instructive! But a note or few about adequate preparation first strikes me as a really good idea. Particularly when the story turns out to be “and then I swallowed this sample of engineered resistant Mycobacterium tuberculosis, and I felt great.” Hubris is one of the dangers of a little knowledge.
How did you come to the conclusion that the parent of the comment containing this sentence was a good comment to post?
Are you attempting to direct me on an endlessly-recurring chain of justification? At some point, reflection must stop and action must be taken, or else you will use up all free energy and entropize just thinking of your next action. Correct reasoning teaches you this very quickly.
How did you come to the conclusion that the parent of the comment containing this sentence was a good comment to post?
By heuristic based processing, as with how I do most things. It seems reasonable to assume that the same isn’t true of you, though, so I expected a rather more useful answer to my question. (Relevant heuristics include ‘if confused, ask for information’ and ‘alert friend-type people to mistakes so that they can avoid those mistakes in the future’.)
Are you attempting to direct me on an endlessly-recurring chain of justification? At some point, reflection must stop and action must be taken, or else you will use up all free energy and entropize just thinking of your next action. Correct reasoning teaches you this very quickly.
I wasn’t, actually. I suspect that whatever system you used to decide to make that post is poorly calibrated, and intended to offer help in debugging it. It’s also possible that my model of you is not as accurate as it could be, and that’s what needs debugging. In either case, gathering more information is a reasonable early step in the process.
By heuristic based processing, as with how I do most things. It seems reasonable to assume that the same isn’t true of you, though, so I expected a rather more useful answer to my question. (Relevant heuristics include ‘if confused, ask for information’ and ‘alert friend-type people to mistakes so that they can avoid those mistakes in the future’.)
I also use heuristic reasoning (governed by the meta-heuristic of correct reasoning), and here I thought that User:David_Gerard was significantly overstating the risks of sewer-diving and Scientology classes for humans. Therefore, I added my “independent component” to the discussion.
You shouldn’t troll groups, even if you deem them evil and dangerous, for much the same reason that you shouldn’t (EDIT: previous post had “should”) murder their members.
Yes but LessWrong is a lot like this—witness all the discussions in thrilled detail of drugs that put your brain into a more effective/enjoyable state. It’s assumed that the readership is intelligent/responsible enough to handle this sort of thing.
The outside culture has enough warnings about dangers of using drugs that we don’t have to repeat them here. Everybody knows that playing with them can fry your brain, and you should take proper precautions. I don’t think the outside culture has enough warnings about psychological manipulation techniques in general, nor this particular sect. People routinely think they’ll be less influenced than they are.
And there’s also the thing that while the people who hang around at LW probably have more ammo than usual against the overt bullshit of cults, they also might have some traits that make them more susceptible to cult recruitment. Namely, sparse social networks, which makes you vulnerable to a bunch of techniques that create the feeling of belonging and acceptance of the new community, and tolerance of practices and ideas outside the social mainstream, which gets cult belief systems that don’t immediately trigger bullshit warnings inside your head.
The Aum Shinrikyo cult in Japan that did the subway sarin gas thing reportedly recruited lots of science and engineering students. An engineering mindset may even keep your internalized bullshit detector from working against social proof, since science and engineering are a lot about how weird stuff extrapolated beyond conventional norms works and gives results.
tl;dr: You’re not as smart as you think, probably have a mild mood disorder from lack of satisfactory social interaction, and have no idea how you’ll subconsciously react to direct cult brainwashing techniques. Don’t mess with cults.
How about a word on the major religions? The most obvious difference between a cult and a religion is that the religion is many orders of magnitude more successful at recruitment—which is the very thing that we are being warned about with respect to cults.
The Mormons are a good comparison. They were dangerous lunatics in the mid-1800s—and Brigham Young was a murderous nutter on a par with David Miscavige. These days, they’re slightly weird but very nice (if very, very conservative) people; good neighbours.
You must mean “kill off” metaphorically, since I don’t recall any incidents in which Scientology has killed off Scientologists. In contrast I can recall many very recent incidents in which one old religion—Islam—has killed off adherents. But if “kill off” is a metaphor, then what is the literal danger from Scientology which is being referred to metaphorically as “kill off the host”?
I would caution against using “I don’t recall” to mean “I haven’t researched even slightly”.
I used “I don’t recall” to mean “I don’t recall”. Go ahead and bash me for failing to research the question but please don’t put your words and ideas in my writing.
I think David’s point is that when you say “I don’t recall X”, it matters very much whether you would recall an X to begin with, i.e., whether P(“I recall X” | X has happened) is significantly larger than P(“I don’t recall X” | X has happened). So when you offer up “I don’t recall X”, people assume you’re doing it because the former is larger than the latter.
But if that’s not the case, then you are, in effect, using “I don’t recall” to mean “I haven’t researched”, and this is why David was accusing you of blurring the distinction.
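The point about when “I don’t recall X” is informative can be made concrete with a toy Bayesian calculation. Everything below is a purely illustrative sketch; the probabilities are invented for the example and make no claim about the actual dispute:

```python
# Toy Bayesian update: how much does "I don't recall X" lower P(X)?
# All probabilities here are illustrative assumptions, not measured values.

def posterior_x(prior_x, p_recall_given_x, p_recall_given_not_x=0.0):
    """P(X happened | "I don't recall X"), via Bayes' rule.

    p_recall_given_x: chance the speaker would recall X if it happened.
    p_recall_given_not_x: chance of a (false) recollection if it didn't.
    """
    p_no_recall_given_x = 1 - p_recall_given_x
    p_no_recall_given_not_x = 1 - p_recall_given_not_x
    numerator = p_no_recall_given_x * prior_x
    denominator = numerator + p_no_recall_given_not_x * (1 - prior_x)
    return numerator / denominator

# A well-informed observer would very likely have heard of X if it happened:
# here "I don't recall" is strong evidence against X.
print(posterior_x(prior_x=0.5, p_recall_given_x=0.95))  # ≈ 0.048

# An observer who never researched the topic is unlikely to recall X either way:
# here "I don't recall" is nearly uninformative.
print(posterior_x(prior_x=0.5, p_recall_given_x=0.05))  # ≈ 0.487
```

The difference between the two outputs is exactly the distinction being drawn: “I don’t recall” carries evidential weight only to the extent that recalling was likely in the first place.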
No, you’re inventing my meaning on the basis of a convoluted reading, and you’re neglecting the context. What I said was that I do not recall. And that is true. In context, the issue is whether Scientology kills off its host quickly. I pointed out that Islam, which kills many of its own adherents, is classified (by the preceding comment, implicitly) as not killing off its host quickly. Therefore for Scientology to be classified as killing off its host quickly it must kill more of its own adherents than Islam does. So that is the relevant question.
So: how does David’s evidence address this question? Not very well. A woman died from negligence while in the care of co-religionists. This can barely be connected to the religion itself. When I said that Islam kills off many of its own adherents, I did not have in mind adherents dying from negligence while in the care of co-religionists. I had in mind jihad. But if we want to expand the definition of killing one’s own host, let us do so: let us take into account the economic backwardness caused by Islam in the Middle East. That should greatly increase the death toll of Islam, which does not, by assumption, kill off its host.
So, David’s evidence is hardly pertinent to the question. If we expand the definition of killing off one’s host to accommodate it, then we must do the same for Islam, which makes Islam look very bad indeed.
Now let’s turn to my own evidence. I am an imperfect observer, who is not aware of everything that goes on in the world. But it doesn’t matter whether I am perfect. What matters is whether I’m biased. David says that I did not specially investigate Scientology. No, I didn’t. And also, I didn’t specially investigate Islam. So as an instrument, I am balanced in that respect. And my readout says: I am aware of many dead from Islam, none dead from Scientology. David says I missed one. Oh? And so what? I missed many on the Muslim side too.
The probability of being Muslim is a lot higher (about 1000 times more?) than of being Scientologist, so I presume you’re talking about how many incidents you’d expect to have heard about per capita.
I wish that you were either a more concise or less interesting writer, so that I wouldn’t waste time reading a detailed argument about what’s-been-said.
The probability of being Muslim is a lot higher (about 1000 times more?) than of being Scientologist, so I presume you’re talking about how many incidents you’d expect to have heard about per capita.
That’s one adjustment that needs to be made, though not the only one. The other major adjustment that needs to be made is for proximity. That goes in the opposite direction. But it’s not worthwhile thinking about it with current data—the energy should be spent on getting better data. I just did that for Afghanistan. 38,000 is the most recent figure I found for dead Taliban, who I interpret as seeing themselves as fighting Islamic jihad, seeing as the Taliban is an Islamic theocracy. Divide that by 1000 and you have 38 Scientologists who, going by your figure, need to have died in armed struggle with—I don’t know—the police, maybe, in order for Scientology to match the proportional death toll in religious violence. I’m pretty sure that if 38 Scientologists had died fighting the police, I would have heard about it, even though I didn’t specifically research the question.
And the Taliban is I think just a small part of everyone who died in the last decade in what they considered to be Islamic jihad.
Update—the factor of 1000 is way off. It’s 25,000 Scientologists versus about 1.5 billion Muslims. If, say, 150,000 Muslims have died in armed jihad in the past ten years, then that’s one in 10,000, which comes to two Scientologists who need to die in armed struggle to match proportions. So being a Muslim is probably not significantly more dangerous than being a Scientologist. Further information could reveal that it is less dangerous.
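The per-capita comparison in that update can be laid out explicitly. Note that the population and death figures below are simply the rough estimates quoted in the thread (the 150,000 figure is the commenter’s hypothetical, and the 25,000 membership figure is itself disputed in the reply), not verified statistics:

```python
# Rough per-capita comparison using the figures quoted in the thread.
# All inputs are the commenter's own estimates, not verified data.

muslims = 1_500_000_000
scientologists = 25_000
jihad_deaths_10yr = 150_000  # hypothetical "say" figure from the comment

death_rate = jihad_deaths_10yr / muslims          # one in 10,000
matching_scientologist_deaths = death_rate * scientologists

print(death_rate)                     # 0.0001
print(matching_scientologist_deaths)  # 2.5 — the "two" in the comment, rounded down
```

The arithmetic bears out the comment’s conclusion: at these (assumed) figures, only about two or three Scientologist deaths in armed struggle would match the per-capita rate.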
The membership statistics have been lies for decades. alt.religion.scientology worked out it was around 50k in the late 1990s; I’m surprised it’s as high as 25k now.
I agree that David’s point about Lisa McPherson isn’t counterevidence to the claim you made (or rather, were implying based on not recalling). I was replying only to your statement
I used “I don’t recall” to mean “I don’t recall” … please don’t put your words and ideas in my writing
which was ridiculing the very idea that someone would read your “I don’t recall” to mean “I don’t recall and that is informative in this case”, when people have good reason to do so, as I explained.
If your objection to David’s point was that the McPherson case is not evidence of Scientology “killing off its host”, then you should have said so in your reply at that point (and I would have agreed) rather than merely flaunt your non-standard usage of “I don’t recall” and insult the people who thought you were trying to say something relevant.
rather than merely flaunt your non-standard usage of “I don’t recall” and insult the people who thought you were trying to say something relevant.
You have it backwards. I asked someone not to put words into my mouth. There were much better and less rude ways for him to make the same point. I am not going to continue arguing this at length because Jonathan Graehl just said he doesn’t like being forced to read who-said-what.
Please do not use value-laden and unsupported terms such as “murder” here. Yes, there are some cases of controversial deaths involving Scientology, but none of these could be described as murder of either the formal or less-formal sort.
The existence of R2-45 is rather unsettling, but apparently this ‘auditing procedure’ has never been enacted.
Growth/attrition rates are actually the thing to look at here. Scientology is faster-growing than just about any other modern religion, though the attrition rate is also very high. In order to figure out virulence, figure out what population the S-curve of members of that religion will top out at. If growth is slowing, you’re almost there. If growth is steady, you’re about halfway there. If growth is exponential or approximately so, you’re looking at a religion in its infancy.
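The S-curve heuristic here is just the standard logistic growth model, which can be sketched briefly. The parameter values below (ceiling, rate, midpoint) are arbitrary illustrations, not estimates for any real religion:

```python
import math

# Logistic membership model: N(t) = K / (1 + exp(-r * (t - t0))),
# where K is the ceiling population, r the growth rate, t0 the midpoint.
# All parameter values are arbitrary illustrations.

def members(t, K=100_000, r=0.5, t0=20.0):
    return K / (1 + math.exp(-r * (t - t0)))

def yearly_growth(t, dt=1.0, **kw):
    return members(t + dt, **kw) - members(t, **kw)

# Early on (well before t0), each year's growth exceeds the last:
# roughly exponential, the "infancy" case. Well past t0, growth
# shrinks toward zero: the curve is topping out.
early = [round(yearly_growth(t)) for t in (0, 2, 4)]
late = [round(yearly_growth(t)) for t in (30, 32, 34)]
print(early)  # increasing increments
print(late)   # shrinking increments
```

This is why the comment reads the shape of current growth as a position estimate: accelerating growth puts you near the bottom of the S, steady growth near the midpoint, and slowing growth near the ceiling.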
The Aum Shinrikyo cult in Japan that did the subway sarin gas thing reportedly recruited lots of science and engineering students. An engineering mindset may even keep your internalized bullshit detector from working against social proof, since science and engineering are a lot about how weird stuff extrapolated beyond conventional norms works and gives results.
Oddly, a “sense of belonging” usually makes me feel alienated and uncomfortable. It’s the rare exceptions like LessWrong, where it actually feels like I do fit, and am being challenged and growing and free to express myself, that avoid that.
I can take a shot at it, having experienced something similar.
The general situation usually follows the pattern of “There is a group with easily-noticeable standards A, B, and C and less-easily-noticeable standards X, Y, and Z. I conform to A, B, and C (though probably for different reasons than they do), but not to (some subset of) X, Y, and Z, but since X and Y and Z don’t come up very often, 1) they haven’t figured out that I don’t fit them, and 2) I didn’t realize that those standards were significant until after I’d been accepted as a member of the group (which is where the ‘sense of belonging’ comes in). At no point did I actually mislead the group with regards to X, Y, or Z, but it’s very likely that if they find out that I don’t conform to them, they will assume that I did and there will be large amounts of drama.”
This usually leads to an inclination to hide facts relating to X, Y, and Z, which feels from the inside like being alienated and uncomfortable.
ETA: This isn’t necessarily something that a person would have to be consciously aware of in order for it to happen, and it can also be based on a person’s assumptions about X/Y/Z-like standards if a given group doesn’t make them explicit.
Adelene’s response strikes me as a similar experience. I should also admit that I’m having a lot of trouble actually getting a concrete description of the experience, as it’s primarily emotional/subconscious, but here’s my own go at it:
I suppose the short version is that while I have the social/emotional response of “belonging and acceptance”, I don’t actually feel safe relaxing and letting down my guard around those groups, which produces a secondary emotional response of feeling alienated and uncomfortable that I have to keep those defenses up.
There are various social behaviors that groups will exhibit to build a very strong “sense of belonging”, and it’s more an emotional evaluation than an intellectual one—although the other part is that I often 99% fit with a group, am clearly a valuable member of the group, and risk getting expelled if I reveal that other 1% of myself.
More specifically, I belong to a few groups where revealing one’s status can still result in fairly sharp social ostracization. Thus, once I’ve found a group where I “belong”, I run into the choice of risking all of that to be accepted “for who I really am”, or just shutting up and keeping quiet about things that almost never come up anyways.
In the case of LessWrong, I feel safe because the community strikes me as much more likely to be tolerant of these things, because an online community has much less power to hurt me, and because these things are extremely unlikely to come up here to begin with (and, being an online forum, I can devote time to carefully crafting posts not to reveal anything; that’s still annoying, but gets written off as “I don’t want to post publicly about this” rather than “LessWrong is unsafe”).
The other aspect is simply that a lot of standard recruitment/retention techniques trigger a visceral aversion in me, even if I don’t view the group as a threat and genuinely do want to be a member.
I’ve got a streak of that, though of a different flavor. Some types of ceremonial efforts to solidify group cohesion don’t work for me, so I feel alienated from any group where there’s an assumption that I’ll feel good and devoted because of enforced symbolism.
To be less abstract about it all, I’m American, whatever that means. I can be defensive and even mildly jingoistic about America (though I consider the latter a failing), but I’d be a lot more comfortable with the place and the identity if it weren’t for all the damned flags.
In other news, I’ve been wondering lately whether it would be closer to the truth if, instead of thinking of myself as Jewish (ethnically), it would be better to frame it as “People kept telling me I was Jewish until I started believing it”.
I can be defensive and even mildly jingoistic about America
The US has one of the most effective brainwashing systems in the (first) world, patriotism-wise. I suspect that a part of it is the historical narrative of a real or imagined success against formidable odds, all in the last 200 years or so. The message “America is great” is also constantly all over the school system and the media. This is really hard to resist, no matter how often you repeat to yourself “I ought to keep my identity small”.
I’ve been wondering lately whether it would be closer to the truth if, instead of thinking of myself as Jewish (ethnically), it would be better to frame it as “People kept telling me I was Jewish until I started believing it”.
I’ve heard that sentiment many times, not necessarily from people of Jewish descent, although the latter are an easy example. Jews in early 20th-century Germany thought of themselves as Germans, until “real Germans” disabused them of that notion in the 1930s. The same happened in Russia in the 1950s. Various Yugoslavian ethnicities suddenly realized in the 1990s that they were not just Yugoslavians but Serbs, Albanians, Croatians, etc., and those who did not were quickly and forcefully reminded of it by their neighbors.
I somewhat relate to his comment, and for me it’s because of how much persona, holding myself back, and not letting myself go it takes to be accepted by others. When, and if, it actually does work, it feels like: here all I was trying to do was be a nice guy, and now the ruse has worked? Now it’s like you’ve committed yourself to it.
“You probably have a minor mood disorder from lack of satisfactory social interaction” seems like a rather harsh description of the members of this community. What data generated that thought?
I agree with the description. Why? Because the joy people describe at going to the meetups seems out of proportion to what goes on in the meetups—unless, as the old saying goes, hunger is the best spice.
I started with the assumption that most people posting here live alone or with a small immediate family and occasional interaction with acquaintances, instead of as part of a tightly knit tribe of some dozens of people who share their values and with whom they have constant social interaction. Then I thought about how likely site members are to belong to mainstream-society tribe equivalents like churches, sports fan groups, gangs or political organizations.
The “mood disorder” thing is hyperbole for “your brain would like to be in a more tribe-like social environment than it is in now”, not an attempt at a clinical diagnosis.
You’re not as smart as you think, probably have a mild mood disorder from lack of satisfactory social interaction, and have no idea how you’ll subconsciously react to direct cult brainwashing techniques. Don’t mess with cults.
This is an important point. If you do mess with cults, start with the more innocuous ones before you face the heavy guns. Make sure you can resist the community in an average church before you test yourself against Scientology.
One of the impressive things about Sufism (at least as described by Idries Shah) is that they wouldn’t take people as students who didn’t already have social lives.
He might not. But things will be in his favor if you go in thinking that knowing physics and science will make you impervious to the dark arts, without knowing a lot about psychology, cult and influence techniques and the messier stuff inside your own head.
(I’m not sure if you want to say something extra here by quoting a thing that was described as the “second most dangerous dark side meme” in the linked comment.)
But things will be in his favor if you go in thinking that knowing physics and science will make you impervious to the dark arts, without knowing a lot about psychology, cult and influence techniques and the messier stuff inside your own head.
I wonder about this idea that knowing how someone will be manipulating you is any defense at all against being manipulated by that person. It sounds plausible, but is there any evidence at all that knowledge can have this effect?
Or is knowledge not wholly intellectual, so that it can itself be considered a species of manipulation, just not manipulation of the dark-arts variety? Maybe even “light arts manipulation”? Sorry, had to throw this last paragraph in there because I thought it was interesting.
Compare “I’ve only known this guy for half an hour, but he seems really likable” and “I’ve only known this guy for half an hour, he’s been running through the tricks from the cult salesman playbook and is giving off a real likable impression at this point”.
You still need to have your own head game in order to actually counteract the subconscious impressions you are getting, but it will probably help to know that a contest is even happening.
I think what you say is plausible. But I also think that it is also plausible that a “likable impression” isn’t just an appearance, but the effect of you actually starting to like the guy. I think that’s the sort of thing that concerns me, that at a certain point our social instincts take over and we lose the ability to detach ourselves from the situation.
That’s a valid point. Women who have read about the pickup artist techniques report that the techniques still work on them even when they’re aware the person is using them. On the other hand, SWIM says that being aware of various techniques has helped him guard against HR methods on the basis of “Oh, now he’s moving into stage x, next he’s going to...”. SWIM would say that it depends to what degree you’re predisposed against the person using them.
Be aware that some techniques are more obvious than others. Some are really obvious once you know they exist, but also really obscure, so you won’t know they’re being used unless you’ve read about them before.
Interesting. My intuition and experience say this is screamingly, overtly incorrect. The fact that yours do not means I’m probably wrong—either about the ‘overtly’ or the ‘incorrect’!
Arguably, Internet culture has a tremendous amount of information on the dangers of Scientology in particular. (And I’m one of the people who put it there personally.) But you are entirely correct: people are convinced they’re much less manipulable than they are. I need to write something for LW on the subject (as I’ve been idly contemplating doing for about 6 months).
I would think the easiest method, albeit not terribly objective, would simply be to get someone who is fairly good at manipulation and play out scenarios with them. I’ve done this a few times as the manipulator, and it’s sort of scary how easily I can manipulate people in specific games, even when they know the rules and have witnessed some of my techniques.
If you do try it, I’ll comment that time and social pressure help me a lot in making people more pliable, too. I do these as a group exercise, so there’s a lot of peer pressure both to perform well, and not to use exactly the sort of “cheats” you should be using to resist manipulation. It’s also helped that I’ve always known the group and thus known how to tweak myself to hit specific weaknesses.
If you find something more useful than this, I’d love to hear it. I’ve merely learned I’m fairly good at manipulating—I have no clue how good I am at resisting :)
Having not tested them, I wouldn’t be sure. I tend to do best with people who are either following an easily inferred pattern (office workers, security, etc.) or people who I know personally, which would make it harder to do with someone I don’t know. You also are neither “disposable” (someone I’ll never deal with again) nor a friend, which adds a bit of social awkwardness.
Given that’s an entire paragraph of excuses, I suppose I should offer to try it anyway, if you want :)
For anyone wondering how this went, handoflixue failed to manipulate me into anything, in fact most of the successful manipulating was the other way around :-)
I have occasionally seen quizzes that purport to tell you how biased you are in purportedly relevant ways to cult susceptibility. I can’t say I found any of them revelatory, as, since you know what the test is testing, it’s way too easy to answer with the right answer rather than the true readout, even when you want the latter. I suppose proper testing would have to be similar to psychological measures of cognitive biases.
“Sure, I’ll correct it, even though people are obviously aware of [caricature of your idiotic warning].”
That is, accepting a correction with a passive-aggressive jab at the dummy who pointed it out. [Note: edited comment several times, a reply might begin before the latest.]
I think you “hear” the comment in this tone because that’s how you would mean it if you wrote it. But to me, the tone seems reasonable, because when I place myself in lukeprog’s position I don’t imagine myself feeling any kind of aggression.
I don’t think I’m imagining the caricaturing, at least, and this is far from the first time I’ve seen lukeprog blame others anytime anyone mentions anything wrong with a post of his.
Also, this
I think you “hear” the comment in this tone because that’s how you would mean it if you wrote it.
So wait, you can know better what I was thinking, but I can’t know better what lukeprog was thinking?
Anyway, here are your links of the same thing going on:
1: Lukeprog metaphorically kicking and screaming when asked for clarification of a citation, then insulting those who would have found the answer “I just read the abstract” helpful.
2: Lukeprog directing me on fruitless searches of his citations, then, when that doesn’t work, equating his intuition with what his sources say, all to avoid admitting there might be some dissonance between his recommendations that he didn’t realize.
I didn’t want to make this a big referendum about a bad habit of Luke’s—I deleted mention of earlier occurrences from earlier posts so as not to widen the confrontation—but you asked for examples from the past.
I read the threads you linked, and my own assessment of them does not accord with yours. (Perhaps you will not be surprised by this.) This whole exchange and the ones you link have a tone I think of as “typical SilasBarta”: uncharitable and far more argumentative than necessary. It frustrates me because I find it tiring and unpleasant to read and/or participate in, and yet I recognize that you often have good insights that I will have to forgo if I want to avoid dealing with your style of interaction.
You don’t have to trust my judgment on this. See Tyrrell’s input on the first and warpforge’s in the second. Whatever I did or didn’t say, whatever tone I should or shouldn’t have been using, it should be clear that lukeprog’s response in both cases was to give knowably unhelpful replies and divert attention away from the proffered shortcoming, just as he’s doing here, which should satisfy your curiosity about why I would read him that way here.
If you really do think it’s okay to reply as lukeprog did here, when I would think you’d be the first person to criticize the tone of “okay I’ll fix it but I’m going to mock your concern”, then I’ll be sure to keep that in mind for my future interaction with you—but I doubt you actually think that.
This whole exchange and the ones you link have a tone I think of as “typical SilasBarta”: uncharitable and far more argumentative than necessary.
Indeed. I asked a simple question about the sources and didn’t get the simple answer until ~5 rounds of back-and-forth—that was way too much argumentativeness for what I was asking for! I’m glad you’re right on top of criticizing Luke for that instead of me!
it should be clear that lukeprog’s response in both cases was to give knowably unhelpful replies and divert attention away from the proffered shortcoming, just as he’s doing here, which should satisfy your curiosity about why I would read him that way here. [emphasis added]
This is precisely the interpretation of lukeprog’s comments that I do not share, especially the bolded text.
Indeed. I asked a simple question about the sources and didn’t get the simple answer until ~5 rounds of back-and-forth...
Actually you got the answers directly [1][2], and, if the timestamps are to be trusted, before any back-and-forth (as JGWeissman noted).
So we’re at least agreed on the replies being knowably unhelpful then?
Actually you got the answers directly [1][2], and, if the timestamps are to be trusted, before any back-and-forth (as JGWeissman noted).
I didn’t get clarification that lukeprog was basing his characterization solely on the first two pages, and didn’t actually read the papers himself, until after the back-and-forth. So JGWeissman is wrong, I just didn’t bother re-re-explaining stuff to him at the time.
So we’re at least agreed on the replies being knowably unhelpful then?
That one is trickier. It depends on what you meant by “knowably”, that is, knowable by whom, in what state of information.
I didn’t get clarification… re-re-explaining...
I was going to try to dissect this, but rather than getting into the weeds of that exchange, I’ll just say that to me your position seems to be predicated on assertions you take to be obviously factual but that I believe to be uncharitable inferences on your part. At this point my endurance is giving out, so I’m going to leave the question of exactly which assertions I’m talking about as an exercise for the reader.
If you really do think it’s okay to reply as lukeprog did here, when I would think you’d be the first person to criticize the tone of “okay I’ll fix it but I’m going to mock your concern”, then I’ll be sure to keep that in mind for my future interaction with you—but I doubt you actually think that.
I wouldn’t actually be the first person to criticize that tone; I care much more about the effort to make the fix than the mockery. I’d rather the mockery not happen, of course, but for example, if you were to tell me, “I’m sorry that you find reading my arguments hurts your fee-fees, poor blossom; in the future I’ll make an effort to question my inferences about other people’s motivations and states of mind,” I’d totally let the former part of the statement slide in light of the latter.
Hmmm. Well, not the tone I intended. It literally did not occur to me that people would consider taking a Scientology course as a result of my post, but then I updated as a result of David’s comment, and that is why I added the disclaimer to the first paragraph. “Figured” in my comment is past tense on purpose.
Our brains can add in these tones when they feel certain ways without it being consciously available. It’s tough stuff to keep out of discourse; our language is geared toward opinionated conflict in any case.
That’s a fair point; conversely, there are entire websites (or so I’ve heard) dedicated to obvious warnings, and there are already people making fun of how obvious his warning is. So I’m thinking his pre-emption was pretty close to spot on.
Do you think that “Don’t take this Scientology course, which I just spent half the article praising with nary a bad word for Scientology” falls into the class of obvious warnings? Also, lukeprog was caricaturing David’s argument.
Wow, so if I say yes, then what? Will we go back and forth for a hundred pages in a good old fashioned internet flame war? No thanks, I have better uses of my time. ;)
We know that scientology is bad; no one here is in any doubt about their legitimacy or thinks they might be some cool people to hang out with. Conversely, that course is sounding pretty good, which is what he was praising. Complaining until he adds a warning at the end saying we shouldn’t take it is pretty silly, considering he obviously intends us to take the course or something similar to it.
And so what? He’s entitled to his opinion about scientology too, as well as their courses.
He’s not entitled to caricature people’s concerns though.
Also, it’s kind of interesting all the little details that trickled out afterward: “Oh, by the way, the place was deserted … and I had to practice on a 12 year old girl … and I had already been well-versed in what to expect and so had unusual resistance to their tricks...”
That’s his way of communicating, I took it as a joke personally.
If you’re suspecting that he’s a stooge for scientology, say it outright. I didn’t really think it was that strange that he mentioned the little details; not to mention that all of us here are pretty well versed in scientology by now.
Oh I agree it’s dangerous. The world is filled with dangerous ideas and pointy bits, we’re all adults here and can make our own decisions without child friendly warnings over everything.
True. Nevertheless I’ve always felt common sense to be a hazy subject. I’d prefer to use the words “personal judgement”. They can use their personal judgement ;) to prepare against the risks in order to get the benefits of the course. Or not. This stuff sounds pretty similar to what beginner PUAs are taught; those guys hold courses too, although you might end up paying way more.
I don’t think he’s a stooge, not at all. I think, however, after reviewing the exchange and David Gerard’s input, that he lacked a sort of awareness of what was going on, and didn’t appreciate the dangers others would face in his position.
FWIW, I did read his initial article as, “Go take this Scientology course—the exercises are great, just don’t get sucked into the religion.” Which is a much weaker warning than he now gives.
Oh, you mean I should make it clear that Scientology is dangerous and people shouldn’t take Scientology classes? I figured that would be obvious, but okay: I added it to the post.
I think your disclaimer looks too much like an implicit challenge: “I dabbled with Scientology classes but didn’t get hooked because I’m that rational/self-disciplined/awesome; but you shouldn’t try it because you’re probably not as awesome, and you might get reeled in.”
The real history of the disclaimer, though, is more like, “I dabbled and didn’t get hooked because I’m awesome, and I didn’t warn you about it at first because I think you’re awesome, but David Gerard thinks otherwise and he twisted my arm.”
For my part, I appreciated having my awesomeness recognized, however briefly. It’s not every day that other people notice that about me. :)
I am in fact just a big meanie about this stuff. “Dad just won’t let me get into the really good mind controlling, he’s so oppressive. Where are my Sea Org teenage minions? This is sooo bogus.”
If in 25 years any of your kids run an international cult I’m blaming you.
The daughter will be the next Dark Lord. The girlfriend will be running the cult.
You’re not my real dad!
I work sixteen hours a day keeping the Dutch from invading and this is the thanks I get? That’s IT. My ocean, my rules. You are GROUNDED, young colonies!
You are awesome.
I’m not convinced “p.s.: don’t do this thing that worked out really well for me and I shall now describe in thrilled detail” entirely makes it no longer functionally a personal recommendation, but it’s possibly better than nothing. Thank you.
Yes but LessWrong is a lot like this—witness all the discussions in thrilled detail of drugs that put your brain into a more effective/enjoyable state. It’s assumed that the readership is intelligent/responsible enough to handle this sort of thing.
The desire to succeed in unorthodox ways (“cheat” at life) is strong in many members of this community—Luke’s Scientology story fits that pattern very well. It certainly makes me want to try a com course and I’ve read about Scientology in endless detail—including some of your work.
Sewer-diving could be fun, and instructive! But a note or few about adequate preparation first strikes me as a really good idea. Particularly when the story turns out to be “and then I swallowed this sample of engineered resistant mycobacterium tuberculosis, and I felt great.” Hubris is one of the dangers of a little knowledge.
Sewer-diving is, in fact, fun and safe for humans, and your warnings about the dangers are alarmist and excessive.
Scientology classes are also safe.
How did you come to the conclusion that this was a good comment to post?
How did you come to the conclusion that the parent of the comment containing this sentence was a good comment to post?
Are you attempting to direct me on an endlessly-recurring chain of justification? At some point, reflection must stop and action must be taken, or else you will use up all free energy and entropize just thinking of your next action. Correct reasoning teaches you this very quickly.
By heuristic based processing, as with how I do most things. It seems reasonable to assume that the same isn’t true of you, though, so I expected a rather more useful answer to my question. (Relevant heuristics include ‘if confused, ask for information’ and ‘alert friend-type people to mistakes so that they can avoid those mistakes in the future’.)
I wasn’t, actually. I suspect that whatever system you used to decide to make that post is poorly calibrated, and intended to offer help in debugging it. It’s also possible that my model of you is not as accurate as it could be, and that’s what needs debugging. In either case, gathering more information is a reasonable early step in the process.
I also use heuristic reasoning, (governed by the meta-heuristic of correct reasoning), and here I thought that User:David_Gerard was significantly overstating the risks of sewer-diving and Scientology classes for humans. Therefore, I added my “independent component” to the discussion.
Sewer diving is in fact a favourite of urban explorers. And I must admit that trolling Scientology in my dissolute youth was lots of fun :-D
You shouldn’t troll groups, even if you deem them evil and dangerous, for much the same reason that you shouldn’t (EDIT: previous post had “should”) murder their members.
The outside culture has enough warnings about dangers of using drugs that we don’t have to repeat them here. Everybody knows that playing with them can fry your brain, and you should take proper precautions. I don’t think the outside culture has enough warnings about psychological manipulation techniques in general, nor this particular sect. People routinely think they’ll be less influenced than they are.
And there’s also the thing that while the people who hang around at LW probably have more ammo than usual against the overt bullshit of cults, they also might have some traits that make them more susceptible to cult recruitment. Namely, sparse social networks, which make you vulnerable to a bunch of techniques that create the feeling of belonging and acceptance in the new community, and tolerance of practices and ideas outside the social mainstream, which lets in cult belief systems that don’t immediately trigger bullshit warnings inside your head.
The Aum Shinrikyo cult in Japan that did the subway sarin gas thing reportedly recruited lots of science and engineering students. An engineering mindset can also keep you pushing past internalized social-proof warnings, since science and engineering are a lot about how weird stuff extrapolated beyond conventional norms works and gives results.
tl;dr: You’re not as smart as you think, probably have a mild mood disorder from lack of satisfactory social interaction, and have no idea how you’ll subconsciously react to direct cult brainwashing techniques. Don’t mess with cults.
How about a word on the major religions? The most obvious difference between a cult and a religion is that the religion is many orders of magnitude more successful at recruitment—which is the very thing that we are being warned about with respect to cults.
Parasite species that have been around a long time have mostly evolved not to kill their host very fast. With new species, all bets are off.
The Mormons are a good comparison. They were dangerous lunatics in the mid-1800s—and Brigham Young was a murderous nutter on a par with David Miscavige. These days, they’re slightly weirdy but very nice (if very, very conservative) people; good neighbours.
You must mean “kill off” metaphorically, since I don’t recall any incidents in which Scientology has killed off Scientologists. In contrast I can recall many very recent incidents in which one old religion—Islam—has killed off adherents. But if “kill off” is a metaphor, then what is the literal danger from Scientology which is being referred to metaphorically as “kill off the host”?
http://en.wikipedia.org/wiki/Death_of_Lisa_McPherson—and she was hardly the first.
I would caution against using “I don’t recall” to mean “I haven’t researched even slightly”.
I used “I don’t recall” to mean “I don’t recall”. Go ahead and bash me for failing to research the question but please don’t put your words and ideas in my writing.
I think David’s point is that when you say “I don’t recall X”, it matters very much whether you would recall an X to begin with, i.e., whether P(“I recall X” | X has happened) is significantly larger than P(“I don’t recall X” | X has happened). So when you offer up “I don’t recall X”, people assume you’re doing it because the former is larger than the latter.
But if that’s not the case, then you are, in effect, using “I don’t recall” to mean “I haven’t researched”, and this is why David was accusing you of blurring the distinction.
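The point about P(“I recall X” | X) can be made concrete with a small Bayesian sketch. The probabilities below are illustrative placeholders I made up, not claims about anyone’s actual memory:

```python
# "I don't recall X" is evidence against X only to the extent that you
# WOULD have recalled X had it happened (i.e., high P(recall | X)).

def posterior_given_no_recall(prior, p_recall_if_true, p_recall_if_false=0.0):
    """P(X | "I don't recall X"), via Bayes' rule."""
    p_no_recall_if_true = 1 - p_recall_if_true
    p_no_recall_if_false = 1 - p_recall_if_false
    numerator = p_no_recall_if_true * prior
    denominator = numerator + p_no_recall_if_false * (1 - prior)
    return numerator / denominator

# Attentive observer: would almost surely have heard of X had it happened,
# so not recalling it is strong evidence against X.
print(posterior_given_no_recall(prior=0.5, p_recall_if_true=0.95))  # ≈ 0.048

# Unresearched observer: might easily have missed X, so "I don't recall"
# barely moves the needle.
print(posterior_given_no_recall(prior=0.5, p_recall_if_true=0.10))  # ≈ 0.474
```

This is the whole dispute in two lines: whether “I don’t recall” is informative depends entirely on which kind of observer is speaking.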
No, you’re inventing my meaning on the basis of a convoluted reading, and you’re neglecting the context. What I said was that I do not recall. And that is true. In context, the issue is whether Scientology kills off its host quickly. I pointed out that Islam, which kills many of its own adherents, is classified (by the preceding comment, implicitly) as not killing off its host quickly. Therefore for Scientology to be classified as killing off its host quickly it must kill more of its own adherents than Islam does. So that is the relevant question.
So: how does David’s evidence address this question? Not very well. A woman died from negligence while in the care of co-religionists. This can barely be connected to the religion itself. When I said that Islam kills off many of its own adherents, I did not have in mind adherents dying from negligence while in the care of co-religionists. I had in mind jihad. But if we want to expand the definition of killing one’s own host, let us do so: let us take into account the economic backwardness caused by Islam in the Middle East. That should greatly increase the death toll of Islam. Which does not, by assumption, kill off its host.
So, David’s evidence is hardly pertinent to the question. If we expand the definition of killing off one’s host to accommodate it, then we must do the same for Islam, which makes Islam look very bad indeed.
Now let’s turn to my own evidence. I am an imperfect observer, who is not aware of everything that goes on in the world. But it doesn’t matter whether I am perfect. What matters is whether I’m biased. David says that I did not specially investigate Scientology. No, I didn’t. And also, I didn’t specially investigate Islam. So as an instrument, I am balanced in that respect. And my readout says: I am aware of many dead from Islam, none dead from Scientology. David says I missed one. Oh? And so what? I missed many on the Muslim side too.
The probability of being Muslim is a lot higher (about 1000 times more?) than of being Scientologist, so I presume you’re talking about how many incidents you’d expect to have heard about per capita.
I wish that you were either a more concise or less interesting writer, so that I wouldn’t waste time reading a detailed argument about what’s-been-said.
That’s one adjustment that needs to be made, though not the only one. The other major adjustment that needs to be made is for proximity. That goes in the opposite direction. But it’s not worthwhile thinking about it with current data—the energy should be spent on getting better data. I just did that for Afghanistan. 38,000 is the most recent figure I found for dead Taliban, who I interpret as seeing themselves as fighting Islamic jihad, seeing as the Taliban is an Islamic theocracy. Divide that by 1000 and you have 38 Scientologists who, going by your figure, need to have died in armed struggle with—I don’t know—the police, maybe, in order for Scientology to match the proportional death toll in religious violence. I’m pretty sure that if 38 Scientologists had died fighting the police, I would have heard about it, even though I didn’t specifically research the question.
And the Taliban is I think just a small part of everyone who died in the last decade in what they considered to be Islamic jihad.
Update—the factor of 1000 is way off. It’s 25,000 Scientologists versus about 1.5 billion Muslims. If, say, 150,000 Muslims have died in armed jihad in the past ten years, then that’s one in 10,000, which comes to two Scientologists who need to die in armed struggle to match proportions. So being a Muslim is probably not significantly more dangerous than being a Scientologist. Further information could reveal that it is less dangerous.
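As a sanity check, the same back-of-the-envelope comparison in code. All figures are the disputed estimates from this thread (25k Scientologists, 1.5 billion Muslims, a hypothetical 150,000 jihad deaths over ten years), not vetted data:

```python
# Disputed estimates from the thread above, for illustration only.
muslims = 1_500_000_000        # ~1.5 billion adherents
scientologists = 25_000        # low-end membership estimate cited in-thread
jihad_deaths = 150_000         # hypothetical ten-year figure from the comment

death_rate = jihad_deaths / muslims              # one in 10,000
expected_equivalent = death_rate * scientologists

print(muslims // scientologists)   # actual population factor: 60000, not 1000
print(expected_equivalent)         # 2.5 — the "two Scientologists" in the comment
```

Note the population factor works out to 60,000, not 1,000, which is what the “Update” above is correcting for; the per-capita conclusion (a couple of deaths would match proportions) survives the correction.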
I also assumed there were far more than 25k Scientologists.
Yeah, I’m surprised too. I’m basing 25k on this.
The membership statistics have been lies for decades. alt.religion.scientology worked out it was around 50k in the late 1990s; I’m surprised it’s as high as 25k now.
I agree that David’s point about Lisa McPherson isn’t counterevidence to the claim you made (or rather, were implying based on not recalling). I was replying only to your statement
which was ridiculing the very idea that someone would read your “I don’t recall” to mean “I don’t recall and that is informative in this case”, when people have good reason to do so, as I explained.
If your objection to David’s point was that the McPherson case is not evidence of Scientology “killing off its host”, then you should have said so in your reply at that point (and I would have agreed) rather than merely flaunt your non-standard usage of “I don’t recall” and insult the people who thought you were trying to say something relevant.
You have it backwards. I asked someone not to put words into my mouth. There were much better and less rude ways for him to make the same point. I am not going to continue arguing this at length because Jonathan Graehl just said he doesn’t like being forced to read who-said-what.
Ruin their life or mess them up mentally.
Check out Auditing Procedure R2-45. There are also a number of less formal murders attributed to them. Ask Google for “Scientology Murder”.
Please do not use value-laden and unsupported terms such as “murder” here. Yes, there are some cases of controversial deaths involving Scientology, but none of these could be described as murder of either the formal or less-formal sort.
The existence of R2-45 is rather unsettling, but apparently this ‘auditing procedure’ has never been enacted.
Okay, edited to use the less value-laden term “exteriorization”.
Growth/attrition rates are actually the thing to look at here. Scientology is faster-growing than just about any other modern religion, though the attrition rate is also very high. In order to figure out virulency, figure out what population the S-curve of members of that religion will top out at. If growth is slowing, you’re almost there. If growth is steady, you’re about halfway there. If growth is exponential or approximately so, you’re looking at a religion in its infancy.
This has of course been covered here before (with reference to this and this).
Umm. Not all of us. I may be vulnerable to cults for other reasons, namely my conformist personality, but not lack of people to talk to.
Oddly, a “sense of belonging” usually makes me feel alienated and uncomfortable. It’s the rare exceptions like LessWrong, where it actually feels like I do fit, and am being challenged and growing and free to express myself, that avoid that.
This sounds very odd. In fact, it sounds oxymoronic. Can you explain?
I can take a shot at it, having experienced something similar.
The general situation usually follows the pattern of “There is a group with easily-noticeable standards A, B, and C and less-easily-noticeable standards X, Y, and Z. I conform to A, B, and C (though probably for different reasons than they do), but not to (some subset of) X, Y, and Z. Since X, Y, and Z don’t come up very often, 1) they haven’t figured out that I don’t fit them, and 2) I didn’t realize that those standards were significant until after I’d been accepted as a member of the group (which is where the ‘sense of belonging’ comes in). At no point did I actually mislead the group with regards to X, Y, or Z, but it’s very likely that if they find out that I don’t conform to them, they will assume that I did and there will be large amounts of drama.”
This usually leads to an inclination to hide facts relating to X, Y, and Z, which feels from the inside like being alienated and uncomfortable.
ETA: This isn’t necessarily something that a person would have to be consciously aware of in order for it to happen, and it can also be based on a person’s assumptions about X/Y/Z-like standards if a given group doesn’t make them explicit.
Adelene’s response strikes me as a similar experience. I should also admit that I’m having a lot of trouble actually getting a concrete description of the experience, as it’s primarily emotional/subconscious, but here’s my own go at it:
I suppose the short version is that while I have the social/emotional response of “belonging and acceptance”, I don’t actually feel safe relaxing and letting down my guard around those groups, which produces a secondary emotional response of feeling alienated and uncomfortable that I have to keep those defenses up.
There are various social behaviors that groups will exhibit to build a very strong “sense of belonging”, and it’s more an emotional evaluation than an intellectual one—although the other part is that I often 99% fit with a group, am clearly a valuable member of the group, and risk getting expelled if I reveal that other 1% of myself.
More specifically, I belong to a few groups where revealing one’s status can still result in fairly sharp social ostracization. Thus, once I’ve found a group where I “belong”, I run into the choice of risking all of that to be accepted “for who I really am”, or just shutting up and keeping quiet about things that almost never come up anyways.
In the case of LessWrong, I feel safe because the community strikes me as much more likely to be tolerant of these things, because an online community has much less power to hurt me, and because these things are extremely unlikely to come up here to begin with (and, being an online forum, I can devote time to carefully crafting posts not to reveal anything; that’s still annoying, but gets written off as “I don’t want to post publicly about this” rather than “LessWrong is unsafe”).
The other aspect is simply that a lot of standard recruitment/retention techniques trigger a visceral aversion in me, even if I don’t view the group as a threat and genuinely do want to be a member.
I’ve got a streak of that, though of a different flavor. Some types of ceremonial efforts to solidify group cohesion don’t work for me, so I feel alienated from any group where there’s an assumption that I’ll feel good and devoted because of enforced symbolism.
To be less abstract about it all, I’m American, whatever that means. I can be defensive and even mildly jingoistic about America (though I consider the latter a failing), but I’d be a lot more comfortable with the place and the identity if it weren’t for all the damned flags.
In other news, I’ve been wondering lately whether it would be closer to the truth if, instead of thinking of myself as Jewish (ethnically), it would be better to frame it as “People kept telling me I was Jewish until I started believing it”.
The US has one of the most effective brainwashing systems in the (first) world, patriotism-wise. I suspect that a part of it is the historical narrative of a real or imagined success against formidable odds, all in the last 200 years or so. The message “America is great” is also constantly all over the school system and the media. This is really hard to resist, no matter how often you repeat to yourself “I ought to keep my identity small”.
I’ve heard that sentiment many times, not necessarily from people of Jewish descent, although the latter are an easy example. Jews in early-20th-century Germany thought of themselves as Germans, until “real Germans” disabused them of that notion in the 1930s. The same happened in Russia in the 1950s. Various Yugoslavian ethnicities suddenly realized in the 1990s that they were not just Yugoslavians, but Serbs, Albanians, Croatians etc., and those who did not were quickly and forcefully reminded of it by their neighbors.
I somewhat relate to his comment, and for me it’s because of how much persona it requires to be accepted by others: how much holding myself back and not letting myself go. When, and if, it actually does work, it feels like all I was trying to do was be a nice guy, and now the ruse has worked? Now it’s like you’ve committed yourself to it.
“You probably have a minor mood disorder from lack of satisfactory social interaction” seems like a rather harsh description of the members of this community. What data generated that thought?
I agree with the description. Why? Because the joy people describe at going to the meetups seems out of proportion to what goes on in the meetups—unless, as the old saying goes, hunger is the best spice.
I started with the assumption that most people posting here live alone or with a small immediate family and occasional interaction with acquaintances, instead of as part of a tightly knit tribe of some dozens of people who share their values and whom they have constant social interaction with. Then I thought about how probable it was for site members to belong to mainstream-society tribe-equivalents like churches, sports fan groups, gangs or political organizations.
The “mood disorder” thing is hyperbole for “your brain would like to be in a more tribe-like social environment than it is in now”, not an attempt at a clinical diagnosis.
This is an important point. If you do mess with cults, start with the more innocuous ones before you face the heavy guns. Make sure you can resist the community in an average church before you test yourself against Scientology.
One of the impressive things about Sufism (at least as described by Idris Shah) is that they wouldn’t take people as students who didn’t already have social lives.
In other words “don’t try to argue with the devil^H^H Scientologist—he has more experience at it than you”.
He might not. But things will be in his favor if you go in thinking knowing physics and science will make you impervious to the dark arts, without knowing a lot about psychology, cult and influence techniques and the messier stuff inside your own head.
(I’m not sure if you want to say something extra here by quoting a thing that was described as the “second most dangerous dark side meme” in the linked comment.)
I do believe you’ve nailed it. Well done, sir.
I wonder about this idea that knowing how someone will be manipulating you is any defense at all from being manipulated by that person. It sounds plausible, but is there any evidence at all that knowledge can have this effect?
Or is knowledge not wholly intellectual, so that it can itself be considered a species of manipulation, just not manipulation of the dark arts variety? Maybe even “light arts manipulation”? Sorry, had to throw this last paragraph in there because I thought it was interesting.
Compare “I’ve only known this guy for half an hour, but he seems really likable” and “I’ve only known this guy for half an hour, he’s been running through the tricks from the cult salesman playbook and is giving off a real likable impression at this point”.
You still need to have your own head game in order to actually counteract the subconscious impressions you are getting, but it will probably help to know that a contest is even happening.
I think what you say is plausible. But I think it is also plausible that a “likable impression” isn’t just an appearance, but the effect of you actually starting to like the guy. I think that’s the sort of thing that concerns me: that at a certain point our social instincts take over and we lose the ability to detach ourselves from the situation.
That’s a valid point. Women who have read about the pickup artist techniques report that the techniques still work on them even when they’re aware the person is using them. On the other hand, SWIM says that being aware of various techniques has helped him guard against HR methods on the basis of “Oh, now he’s moving into stage x, next he’s going to...”. SWIM would say that it depends to what degree you’re predisposed against the person using them.
Be aware that some techniques are more obvious than others. Some are really obvious when you know they exist, but also really obscure, so you won’t know they’re being used unless you’ve read about it before.
Interesting. My intuition and experience say this is screamingly, overtly incorrect. The fact that yours do not means I’m probably wrong—either about the ‘overtly’ or the ‘incorrect’!
Arguably, Internet culture has a tremendous amount of information on the dangers of Scientology in particular. (And I’m one of the people who put it there personally.) But you are entirely correct: people are convinced they’re much less manipulable than they are. I need to write something for LW on the subject (as I’ve been idly contemplating doing for about 6 months).
Do you know of any techniques to measure your own manipulability somewhat objectively?
I would think the easiest method, albeit not terribly objective, would simply be to get someone who is fairly good at manipulation and play out scenarios with them. I’ve done this a few times as the manipulator, and it’s sort of scary how easily I can manipulate people in specific games, even when they know the rules and have witnessed some of my techniques.
If you do try it, I’ll comment that time and social pressure help me a lot in making people more pliable, too. I do these as a group exercise, so there’s a lot of peer pressure both to perform well, and not to use exactly the sort of “cheats” you should be using to resist manipulation. It’s also helped that I’ve always known the group and thus known how to tweak myself to hit specific weaknesses.
If you find something more useful than this, I’d love to hear it. I’ve merely learned I’m fairly good at manipulating—I have no clue how good I am at resisting :)
That reminds me of a bit from a book about art forgery—that need, greed, and speed make people more gullible.
I’d love to try this (being the manipulatee). Do your mind tricks work over Skype?
Having not tested them, I wouldn’t be sure. I tend to do best with people who are either following an easily inferred pattern (office workers, security, etc.) or people who I know personally, which would make it harder to do with someone I don’t know. You also are neither “disposable” (someone I’ll never deal with again) nor a friend, which adds a bit of social awkwardness.
Given that’s an entire paragraph of excuses, I suppose I should offer to try it anyway, if you want :)
Good! How about Sunday evening (CEST)?
CEST = UTC+2, correct? I’m PST (UTC-7), so that’d put you 9 hours ahead of me.
My 10 AM would be your 7 PM—would that work for you? I can do a couple hours later if not.
EDIT: I’m assuming Sunday, May 1st, 2011. If you could send me your Skype name that will probably also make this much easier :)
For anyone wondering how this went: handoflixue failed to manipulate me into anything; in fact most of the successful manipulating was the other way around :-)
I have occasionally seen quizzes that purport to tell you how biased you are in purportedly relevant ways to cult susceptibility. I can’t say I found any of them revelatory, as, since you know what the test is testing, it’s way too easy to answer with the right answer rather than the true readout, even when you want the latter. I suppose proper testing would have to be similar to psychological measures of cognitive biases.
I wish you wouldn’t take this tone when agreeing to people’s helpful suggestions :-/
Which tone?
“Sure, I’ll correct it, even though people are obviously aware of [caricature of your idiotic warning].”
That is, accepting a correction with passive-aggressive jab at the dummy who pointed it out. [Note: edited comment several times, a reply might begin before the latest.]
I think you “hear” the comment in this tone because that’s how you would mean it if you wrote it. But to me, the tone seems reasonable, because when I place myself in lukeprog’s position I don’t imagine myself feeling any kind of aggression.
I don’t think I’m imagining the caricaturing, at least, and this is far from the first time I’ve seen lukeprog blame others anytime anyone mentions anything wrong with a post of his.
Also, this
was not the basis for the evaluation I made.
...as far as you are aware.
I detect that I might need to update. Links?
Though this seems to be a matter of your introspection versus SilasBarta’s, right?
Yep. I don’t claim knowledge of lukeprog’s actual mental state when he made the comment.
I mean, your respective introspections regarding SilasBarta’s mental state / processes.
Oh, I see. No, I just intended to express the by-now banal notion that people in general aren’t good at knowing why they think what they think.
...as far as I am aware.
So wait, you can know better what I was thinking, but I can’t know better what lukeprog was thinking?
Anyway, here are your links of the same thing going on:
1: Lukeprog metaphorically kicking and screaming when asked for clarification of a citation, then insulting those who would have found the answer “I just read the abstract” helpful.
2: Lukeprog directing me on fruitless searches of his citations, then, when that doesn’t work, equating his intuition with what his sources say, all to avoid admitting there might be some dissonance between his recommendations that he didn’t realize.
I didn’t want to make this a big referendum about a bad habit of Luke’s—I deleted mention of earlier occurrences from earlier posts so as not to widen the confrontation—but you asked for examples from the past.
I read the threads you linked, and my own assessment of them does not accord with yours. (Perhaps you will not be surprised by this.) This whole exchange and the ones you link have a tone I think of as “typical SilasBarta”: uncharitable and far more argumentative than necessary. It frustrates me because I find it tiring and unpleasant to read and/or participate in, and yet I recognize that you often have good insights that I will have to forgo if I want to avoid dealing with your style of interaction.
You don’t have to trust my judgment on this. See Tyrrell’s input on the first and warpforge’s in the second. Whatever I did or didn’t say, whatever tone I should or shouldn’t have been using, it should be clear that lukeprog’s response in both cases was to give knowably unhelpful replies and divert attention away from the proffered shortcoming, just as he’s doing here, which should satisfy your curiosity about why I would read him that way here.
If you really do think it’s okay to reply as lukeprog did here, when I would think you’d be the first person to criticize the tone of “okay I’ll fix it but I’m going to mock your concern”, then I’ll be sure to keep that in mind for my future interaction with you—but I doubt you actually think that.
Indeed. I asked a simple question about the sources and didn’t get the simple answer until ~5 rounds of back-and-forth—that was way too much argumentativeness for what I was asking for! I’m glad you’re right on top of criticizing Luke for that instead of me!
This is precisely the interpretation of lukeprog’s comments that I do not share, especially the bolded text.
Actually you got the answers directly [1][2], and, if the timestamps are to be trusted, before any back-and-forth (as JGWeissman noted).
So we’re at least agreed on the replies being knowably unhelpful then?
I didn’t get clarification that lukeprog was basing his characterization solely on the first two pages, and didn’t actually read the papers himself, until after the back-and-forth. So JGWeissman is wrong, I just didn’t bother re-re-explaining stuff to him at the time.
That one is trickier. It depends on what you meant by “knowably”: that is, knowable by whom, in what state of information.
I was going to try to dissect this, but rather than getting into the weeds of that exchange, I’ll just say that to me your position seems to be predicated on assertions you take to be obviously factual but that I believe to be uncharitable inferences on your part. At this point my endurance is giving out, so I’m going to leave the question of exactly which assertions I’m talking about as an exercise for the reader.
I wouldn’t actually be the first person to criticize that tone; I care much more about the effort to make the fix than the mockery. I’d rather the mockery not happen, of course, but for example, if you were to tell me, “I’m sorry that you find reading my arguments hurts your fee-fees, poor blossom; in the future I’ll make an effort to question my inferences about other people’s motivations and states of mind,” I’d totally let the former part of the statement slide in light of the latter.
No, I can’t. Hence my need to update. Thanks for the links, by the way.
Hmmm. Well, not the tone I intended. It literally did not occur to me that people would consider taking a Scientology course as a result of my post, but then I updated as a result of David’s comment, and that is why I added the disclaimer to the first paragraph. “Figured” in my comment is past tense on purpose.
Our brains can add in these tones when they feel certain ways, without its being consciously available. It’s tough stuff to keep out of discourse; our language is geared toward opinionated conflict in any case.
When people suggest changes, they’re not saying you’re a failure, and you needn’t suggest flaws on their part as if they were.
That’s a fair point; conversely, there are entire websites (or so I’ve heard) dedicated to obvious warnings, and there are already people making fun of how obvious his warning is. So I’m thinking his pre-emption was pretty close to spot on.
Do you think that “Don’t take this Scientology course, which I just spent half the article praising with nary a bad word for Scientology?” falls into the class of obvious warnings? Also, lukeprog was caricaturing David’s argument.
Wow, so if I say yes, then what? Will we go back and forth for a hundred pages in a good old fashioned internet flame war? No thanks, I have better uses of my time. ;)
We know that Scientology is bad; no one here is in any doubt about its legitimacy or thinks Scientologists might be cool people to hang out with. Conversely, the course is sounding pretty good, and the course is what he was praising. Complaining until he adds a warning at the end saying we shouldn’t take it is pretty silly, considering he obviously intends us to take the course or something similar to it.
And so what? He’s entitled to his opinion about Scientology too, as well as its courses.
He’s not entitled to caricature people’s concerns though.
Also, it’s kind of interesting all the little details that trickled out afterward: “Oh, by the way, the place was deserted … and I had to practice on a 12 year old girl … and I had already been well-versed in what to expect and so had unusual resistance to their tricks...”
That’s his way of communicating; I took it as a joke, personally.
If you suspect that he’s a stooge for Scientology, say so outright. I didn’t really think it was that strange that he mentioned the little details; besides, all of us here are pretty well versed in Scientology by now.
I don’t think he’s in any way a stooge. I do think he’s got hazardous levels of hubris and I do think his post was a danger to others.
Oh, I agree it’s dangerous. The world is filled with dangerous ideas and pointy bits; we’re all adults here and can make our own decisions without child-friendly warnings over everything.
If common sense were comparatively robust against mind-control techniques, they wouldn’t be mind-control techniques.
True. Nevertheless, I’ve always felt “common sense” to be a hazy concept; I’d prefer the phrase “personal judgement”. They can use their personal judgement ;) to prepare against the risks in order to get the benefits of the course. Or not. This stuff sounds pretty similar to what beginner PUAs are taught, and those guys hold courses too, though you might end up paying way more.
I don’t think he’s a stooge, not at all. I think, however, after reviewing the exchange and David Gerard’s input, that he lacked a sort of awareness of what was going on, and didn’t appreciate the dangers others would have in his position.
FWIW, I did read his initial article as, “Go take this Scientology course—the exercises are great, just don’t get sucked into the religion.” Which is a much weaker warning than he now gives.