each theist has a certain personal sequence of Dark Arts-ish levers in eir head, the flipping (or un-flipping) of which would snap em out of faith.
This seems like an extremely strong statement and thus hardly believable. Many people would dearly love to discover such powerful secrets. Feel free to share an example or two.
It’s very believable. I’ll give a couple of techniques here.
Reinforce skeptical behavior while modifying their self-image to that of a truth-seeker: “I love talking to you because you pursue the truth over comforting lies”. Be genuine, and by that I mean use the tone you would use to tell someone that their suit looks good.
Give high-status cues. Assume the role of teacher or mentor. Once they want to become more like you, merely expressing your beliefs (not opposing theirs, but expressing yours) will make a significant impact.
Demonstrate that giving up religious belief won’t result in isolation. How exactly you do this will vary based on the social context.
There are a couple of others, such as generating low-status associations with religion, which is a bit advanced and so not worth covering here, and creating false memories and commitments, which is scarily easy to do but absolutely Dark Arts and therefore not covered here.
You will note the absence of “rational argument” on this list. That’s because rational argument is rather ineffective for changing the mind of the person you are arguing with (though it may change the views of observers).
And what is your success rate using these conversion techniques?
3 successes (that I know of) out of 1 attempted. I don’t intentionally deconvert people, generally speaking.
hmm, 300% success rate...
Did you succeed with the one you attempted?
Yes. To be honest, I suspect I only hastened the process by a year or two though (also ended up giving a crash course in evolutionary bio and physics, which helped me understand both subjects much better).
I have recently had the unpleasant experience of being subjected to the kind of dishonest emotional manipulation that is recommended here. A (former) friend tried to convert me to his religion by using these tricks, and I can attest that they are effective if the person on the receiving end is trusting enough and doesn’t realize that they are being manipulated. In my case the absence and avoidance of rational argument eventually led to the failure of the conversion attempt, but not before severe emotional distress had been inflicted on me by a person I used to trust.
Needless to say, I find it unpleasant that these kinds of techniques are mentioned without also mentioning that they are indeed manipulative, dishonest and very easy to abuse.
Does LessWrong have an actual primer on the Dark Arts anywhere? There’s a lot of discussion of Defense Against, but I haven’t seen any Practice Of… Perhaps that’s beyond the scope of what we really intend to teach here?
There are several started sequences, none of which got past their first post.
So any given Practice of the Dark Arts teacher can only last for one term? :)
The last of those 3 (mine), at least, is in the process of being developed. I’m still mostly focusing on reading the relevant literature. I have the rough draft of 3 posts, but since it looks like there will be 10 to 15 of them plus a large post of miscellaneous techniques of influence, I am not posting yet (I will probably reorganize before I post).
I would be interested in knowing what resources you used for this sequence.
As an autist, I find there is a huge swath of innate skills that ‘normal’ people possess which I can only emulate. Social success for me is indistinguishable from Dark Art skill.
To start with, I would recommend (in the following order):
Thinking, Fast and Slow — Kahneman
Influence: Science and Practice — Cialdini
How We Decide — Lehrer
How to Win Friends and Influence People — Carnegie
Nudge — Thaler and Sunstein
Cialdini and Carnegie have a bad habit of not citing sources, so you may want to take any unsubstantiated claims with a grain of salt.
This list is not comprehensive. If anyone else would like to add some recommendations for books or particularly informative studies, I would definitely appreciate it.
In addition to reading, experience in dealing with people is very important for things like this. If you are not currently employed, I would recommend getting a job in sales. This will give you a chance to practice and experiment in a relatively safe environment. Additionally, I have heard that unusual behavior is more accepted in bars, so that might be worth looking into (I’m under 21 and live in America, so that is not really an option for me. As such, bear in mind that this is secondhand advice). Finally, if you are particularly skilled in some subject area, you may want to consider tutoring. In addition to bringing in money and helping someone else, this will allow you to experience being in a high-status situation.
Once again, the list of recommended experiences is not comprehensive. I would welcome any additional suggestions.
Adding two spaces (\s\s) before your newline (\n) will let you do line breaks in Markdown syntax.
Thank you.
Both require powers; the second involves using them unethically.
I look forward very much to seeing your sequence.
Discussion in the comments of this post, in which I perceived Luke as heartily recommending skinny-dipping in sewers for self-improvement purposes. “And then I swallowed this sample of engineered resistant Mycobacterium tuberculosis, and I felt great! Not that you should do that or anything.”
“each theist” is the part of the claim that is too strong, since it would include, among others, the Pope, Mother Teresa and Osama bin Laden. I grant that some techniques do work on some theists (and atheists).
True, if you want to be pedantic about it. In fact, they probably wouldn’t work on most theists in high-status positions. Think about how often you hear about someone “finding God/Allah/Jesus” at a low point of their life when they feel themselves to be failures. Now consider how often someone high-status changes their beliefs.
Didn’t see this! You’re right, that is quite a bit too strong. Let me reduce the strength of that statement: Among theists to whom I have become close enough to ask deeply personal questions and expect truthful answers, such levers seem prevalent.
Really horrifyingly scarily easy. (Most of the comments thread.)
Here are several.
Interesting, the guy must be a very good hypnotist. I’m wondering if he can convert people, as well as deconvert?
It’s my blog. I think I can for the large fraction of atheists that got there by social pressure alone (at least for a month or so), but people that actually understand why atheism is the right answer would be tougher. I’m curious if I could break them too, but that’s way too evil for my tastes.
The techniques don’t cleave down the lines of good and evil epistemically—they cleave down the lines of good and evil instrumentally.
It takes different tools to make someone worse off than it does to help them. If you want to make them better epistemically, then you get to use the fact that having good maps helps you get where you want to be.
It takes different tools to make someone worse off than it does to help them.
Worse off by whose definition? Presumably, if you believed that conversion to Christianity makes one better off, you could use the same techniques (with a different set of arguments) to accomplish the goal.
Both, but the statement is stronger for their definition.
My general approach to helping people is to clear out their fears and then let them reassemble the pieces as they see fit—sometimes suggesting possible solutions. This is more easily used to help people than to hurt them, since they are in full control of their actions and more of the game space is visible to them. I can fool them into thinking they’re helping themselves, but I’d have to include at least selective fear removal (though this can happen accidentally through your own biases!).
In contrast, using leading questions and classical conditioning works equally well regardless of which direction you’re pushing.
Hmm, have been looking through your blog a bit more… I’m wondering if you can help people complaining about akrasia by making their second-order desires first-order ones?
Yep :)
Hmm, you would probably be great playing the jailed AI in an AI boxing experiment (can you beat [someone like] EY?), but how successful would you be playing the guard?
The AI box game still seems stacked against the AI roleplayer for any similar skill level. As the AI, I don’t think I could beat someone like EY or myself on the other end, and as the gatekeeper I think I would beat someone like EY or myself.
I still wouldn’t consider myself secure against even human takeover in general, especially if I’m not prepared for mental assault.
Would you know what to look for?
Also, can you write an AI bot that would have a decent success rate against humans, by finding and exploiting the standard wetware bugs?
For the most part.
Not for any interesting opponent. I can’t even write a ‘real’ chatbot. The only reason I get the results I do is because I immediately force them into a binary yes/no response and then make sure they keep saying yes.
Something here feels off: I’d call the parent a pretty strong claim, effectively “I can cure akrasia (sorta) in the majority of people who ask”. I would have expected someone to have tested this, and reported their results; if positive, I would have expected this to be the sort of thing I would have noticed much sooner than a year and two months later. (In fact, around the time this was posted, I had started reading LessWrong and I’d received an email entitled “Jim’s hypnotherapy” that I ignored for some months).
Basically, my first reaction to this was “Why ain’t ya rich?”
Having said that, I want to build up the courage to PM you for a test, if you’re still doing so; if you’re half as powerful as you claim, then of course I want to benefit from that. ;p
(I’ve been reading your blog and wound up finding this because I typed “hypnotism” into the LW search box.)
effectively “I can cure akrasia (sorta) in the majority of people who ask”.
I wouldn’t quite say that. I meant “yes, akrasia is fixable in this way”. Less “I’m a wizard!” and more “Yes, there’s a solution, it looks like that, so have fun solving the puzzle”
To make a personal claim of competence, I’d have to add some qualifiers. Maybe something like “I expect to be able to cure akrasia (sorta) in the majority of people that commit to solving it with me”, which is a much stricter criterion than “asks”. I’d also have to add the caveat that “curing” might, after reflectively equilibrating, end up with them realizing they don’t want to work as hard as they thought—that’s not my call and it wouldn’t surprise me if a significant number of people went that way to some degree.
I would have expected someone to have tested this, and reported their results; if positive, I would have expected this to be the sort of thing I would have noticed much sooner than a year and two months later.
I’m not sure if you mean within LW in particular. I haven’t yet worked magic on any LWer in this context, but I did offer a couple times.
If you’re counting outside LW, hypnotherapists get results all the time—even “amazing” results. Some people are convinced, some people write it off in one way or another. It doesn’t surprise me all that much, given how people get with “skepticism” and not wanting to be made a fool of over hypnotism.
Basically, my first reaction to this was “Why ain’t ya rich?”
Good question.
The first part of the answer is that I have gotten a ton of value out of these skills, and only expect to gain more.
The second part is that it’s not magic. It’s more of a martial art than a cheat code. Even when it appears to be magic, there’s usually more going on in the background that made it possible. The toughest part is all the meta-level bullshit that people carry around about their problems, which makes getting them into “let’s solve this” mode the hard part. Once you get someone to congruently say “Yes, I’m going to be a good hypnotic subject and we’re going to fix this”, you’ve done 90% of the work—but everyone focuses on the last 10% which looks like magic and then wonders “why not sprinkle this magic pixie dust on everyone!?!”.
Also, getting “rich”—assuming you mean at a level more than charging a couple hundred dollars per hour like many hypnotherapists do—requires you to be good at finding high leverage applications and working your way into them. That’s a whole new subset of skills and I haven’t yet gotten to that stage—though I plan on working on it.
Having said that, I want to build up the courage to PM you for a test, if you’re still doing so; if you’re half as powerful as you claim, then of course I want to benefit from that. ;p
First of all, I don’t like this “if you’re half as powerful as you claim” thing—especially since you seem to have read it as stronger than intended. When I make “strong claims” I do not expect, in the social obligation sense, to be believed. I’m just trying to be understood—that really is how I honestly see things. Take with as much salt as you please.
It’s important to make this explicit because setting up high expectations for a hoping skeptic is a sure way to fail—it sets up a dynamic of me being responsible for their behavior. While I do take responsibility for their actions internally, the only real way I can do this is through making sure they take responsibility for their own actions.
Also, there should be no courage needed. I can’t just take over your mind. With you, I’m not sure whether I’d pull out hypnosis at all, and (almost) certainly can’t just get into it from the start. Also, I can only push you as hard as you let yourself be pushed. Let’s chat some time and see where it goes.
Your link to your blog is down, but once it’s back up, and if I find this claim plausible upon reading it, I would be very interested in trying this on myself.
EDIT: read the blog, and it looks awesome.
You’re welcome to try and break my atheism, but I’m saying that only because I’m reasonably darned sure you can’t do that by any conversational means (so long as we’re actually in a universe that doesn’t have a God, of course, I’m not stating a blind belief known to me to be blind).
Edit: oh, wait, didn’t realize you were using actual hypnotism rather than conversation. Permission retracted; I don’t know enough about how that works.
I’m reasonably darned sure you can’t do that by any conversational means
Agreed. The only way I’d see myself as having a fighting chance would be if you had a strong reason to go into hypnosis and you didn’t know my intentions.
If the world really were at stake, I think I could help you with the red panda problem—though I still have fairly wide confidence intervals on how difficult that would be, because I haven’t tried something like this. I have yet to find a real-life example where I’d encourage self-deception, and a surprisingly large fraction of problems go away when you remove the self-deception.
I have been having a lot of fun using hypnosis and techniques inspired by hypnosis to improve rationality—and successfully. I was a bit disappointed that you didn’t respond to my email offering to show what hypnosis says about training rationality. And now I’m a bit confused with the retraction because I had figured you had completely written me off as a crackpot.
Will Ryan mentioned that you were skeptical of “this stuff”. Can you elaborate on what specifically you’re skeptical about and what kinds of evidence you’d like to see?
I hope you don’t think you are actually “giving amnesia” or doing anything other than roleplaying mind-controller and mind-controllee, in dialogues like these. Those teenagers are just playing along for their own reasons.
That hypothesis certainly isn’t new to me.
There’s a lot of research on hypnotic amnesia. Here are a few studies showing differences between hypnotically suggested amnesia and faked amnesia:
http://psycnet.apa.org/journals/abn/70/2/123/
http://www.ncbi.nlm.nih.gov/pubmed/2348012
http://psycnet.apa.org/journals/abn/105/3/381/
The relationship between “actually giving amnesia” and “roleplaying amnesia” is fascinating, but not something I’m going to get into here.
I certainly don’t mean to say that I have any kind of fully-general way to convert theists. I mean rather to say that as you get closer to individual people, you find out what particular levers they have to flip and buttons they have to push, and that with sufficient familiarity the sequence of just-the-right-things-to-say-and-do becomes clear. But if you would like an example of what I’d say to a specific person (currently there are three to whom I know what I would say), I can do that. Let me know.
But if you would like an example of what I’d say to a specific person (currently there are three to whom I know what I would say), I can do that. Let me know.
Yes, this sounds very intriguing. So, you have a model of their thinking good enough to predict how such conversation would go? Would you be willing to describe it here and then try it IRL (if you deem it appropriate) and report what happened?
I’m going to describe such a conversation (the first of what would, I think, be many) for a girl who I will call Jane, though that is not her name.
Some background: Jane is a devout Catholic, an altar girl, a theology major, a performer of the singing-acting-dancing type, and one of the bubbliest people I know. She is also firmly against gay marriage, abortion, premarital sex, and consumption of alcohol or other drugs (though for some reason she has no problem with consumption of shellfish). You may have read the previous two sentences and thought “there’s a lot of sexual repression going on there” and you would be quite correct, though she would never admit that. Here is what I would say and do. Don’t take the wording too literally; I’m not that good.
tld: (At an appropriate moment) Jane, I have a very personal question for you.
J: Okay, shoot.
tld: It’s about God.
J: Oh dear. I’m listening.
tld: So God exists. And he’s up there, somewhere, shouting down that he loves us. But if tomorrow morning he suddenly vanished—just ceased to exist, packed up and left town, whatever—would you want to know?
J: I—uh—gosh. That would go against everything God’s said, about how he would never abandon us-
tld: I know. But just think of it as a counterfactual question. God leaves, or vanishes. Do you want to know?
J: I don’t know. It’s—I just can’t imagine that happening.
tld: *taking Jane’s hand, gentle smile* Hey. Don’t let it rattle you. Just remember, here in the real world, God’s up there somewhere, and he loves us, and he would never abandon us.
J: I love hearing you say that.
tld: Sure. So in the real world, nothing to worry about. But over there in the imaginary, fake world—God vanishes. Would you want to know?
J: Well… I guess so. Because otherwise it’s just living a lie, isn’t it?
tld: Right. *squeeze hand softly* I’m glad you agree, it’s very brave and honest of you to be able to say that. So the follow-up question is, what would change, in that world?
J: What do you mean?
tld: Well, God was there, and now he’s left that world behind. So it’s a world without God—what changes, what would be different about the world if God weren’t in it?
J: I can’t imagine a world without God in it.
tld: Well, let’s look at it the other way, then. Let’s imagine another world, just like the first two except that it never had a God in the first place, and then God shows up. He came from the other world, the first one we imagined, to give this new world some of His light, right? *reassuring squeeze*
J: *squeeze back* Okay...
tld: So God comes into this new world, and the first thing he does is make it a better place, right? That’s what God does, he makes the world a better place.
J: Yeah! Yeah, exactly. God makes the world a better place.
tld: So God comes down himself, or sends down His son, and feeds the poor and heals the sick, and pretty soon the world is better off because God is there.
J: Of course.
tld: Great! *smile* So let’s think about the other world, the one that got left behind, for a second. What would you do, if you were there?
J: What? (shocked)
tld: Well, the you in the other world finds out there’s no God anymore, and that’s that. So what would you do? *lean in, squeeze hand again* There must be some things you’d dare to do that you wouldn’t otherwise.
J: *pause, blush* Um. Well. I don’t know. I’d have to think about it.
tld: Right, it’s a hard question. *final hand squeeze, lean back* But I hope you’ll think about it, for the next time we talk, and let me know what you’ve come up with. I’ve actually got to run, it’s getting kind of late (or other excuse for why I need to leave, etc.)
Proceed to wait until she brings the subject up again, or bring it up again later myself.
So, yes. The above conversation has two purposes, which are (a) to plant the idea of dealing with a world where God doesn’t exist, and (b) to remind Jane that there are things she wants but can’t have because of her faith so that she has a reason, though unspoken, to want to be rid of it; there are a couple of other things going on as well which I’m sure faul_sname will cringe at, but that’s the gist. Intended arc of development: A few months’ worth of working on a truth-seeking mindset, possibly more work on building rapport and position-of-authority mojo, and eventually the Jenga moment, which it’s difficult to plan out precisely in advance.
And yes, I realize that playing on sexual tension to manipulate someone’s beliefs is, in a word, disgusting. I did say Dark Arts for a reason.
The other two people who’ve been weighing on my mind are let’s-call-him-James and let’s-call-her-Mary, for whom the intended sequence is a little different (neither of them has an easily-accessible repressed-sexuality motivator) but you get the idea, I think.
This... reads to me like a Chick tract more than anything else. I just don’t believe J will be that easy to manipulate.
What’s unreasonable about Chick tracts, I think, is that strangers can’t really walk up and manipulate you like that unless you’re already in an extremely emotionally vulnerable state. It’s easier if there’s an established relationship.
Unless J is much, much less intelligent than you, or you’ve spent a lot of time planning different scenarios, it seems like any one of J’s answers might well require too much thought for a quick response. For example,
tld: Well, God was there, and now he's left that world behind. So it's a world without God - what changes, what would be different about the world if God weren't in it?
J: I can't imagine a world without God in it.
Lots of theists might answer this in a much more specific fashion. “Well, I suppose the world would cease to exist, wouldn’t it?”, “Anything could happen, since God wouldn’t be holding it together anymore!”, or “People would all turn evil immediately, since God is the source of conscience.” all seem like plausible responses. “I can’t imagine a world without God in it” might literally be true, but even if it is, J’s response might be something entirely different, or even something that isn’t really even a response to the question (try writing down a real-life conversation some time, without cleaning it up into what was really meant. People you know probably very often say things that are both surprising and utterly pointless).
I didn’t even go to Catholic school, but in the process of Confirmation I learned enough apologetics to deflect or reject or just willfully not understand most of these.
A Good Catholic will tell you that the universe could not exist without God, and/or that nothing good can exist without God, so if there were no God, there would either be no universe, or the universe would be hell.
It would sort of be like me trying to convince you quantum physics is wrong and starting out by saying, “Imagine a world without quantum physics.” You have nothing with which to substitute quantum physics. Your mind returns a divide by zero error.
Additionally, religious folks in general tend to claim to believe that morality comes from God. And when they say this, they really truly mean that if there were no God, there would be no morality. That the fact that morality exists is a kind of proof that God exists. I am not making this up. I have been told by a religious person that, if they were to learn that God did not exist, they would immediately embark upon an orgy of murder and theft, because, “There would be no reason not to.” They believe this about themselves despite the fact that we know it to be a misunderstanding of psychology. I am not saying all religious people have exactly this glitch, but I am trying to emphasize that your friend(s) probably don’t have the cognitive algorithms in place to even comprehend these questions the way you mean them.
Given that for most of its history humanity didn’t know about quantum physics, and that for the larger part of my life I didn’t know anything substantial about quantum physics either without any serious injury to my imagination, this would be quite easy.
Just a nitpick, I mostly agree with the rest of your comment.
I have been told by a religious person that, if they were to learn that God did not exist, they would immediately embark upon an orgy of murder and theft, because, “There would be no reason not to.” They believe this about themselves despite the fact that we know it to be a misunderstanding of psychology.
To avoid a typical mind fallacy, let’s say that some people really have no non-supernatural reason to avoid murder and theft. But they are in a minority, so there is a high prior probability that the given religious person does not belong there.
However, I would love to know, for the given nonzero subset of humanity that has no non-supernatural reason to avoid murder and theft, how effective religion really is at stopping them.
No, this is a perfect example of belief in belief without actual belief.
Wow. You’re, like, literally the Devil.
I mean that in a nonjudgmental way.
When I first read this, I thought “I could do that, that’s easy! … if I had no ethics whatsoever and didn’t care about true from false.”
Just out of curiosity, do you have the obvious ulterior motive here?
Yes. Which is a very good reason for me not to trust my inclinations.
I certainly wouldn’t be nearly as ethical in your place.
Just call me le Chevalier mal Fet.
Do you get his Noble Phantasm? “Knight of Honor” is potentially one of the most powerful hougu in Fate/Zero.
I… was not even aware that such a game existed; I was referring to The Once and Future King. But clicking through the wiki a little bit has me fascinated by the tangle of mythological references.
I have two words for this: planning fallacy.
This is a very valid point, but I’m less interested in whether such a plan is practical than in whether, assuming feasibility, it is ethical.
That’s pretty good. Of course, there are a few places in this conversation where Jane might deviate from the script, but you know her and I don’t. Were I devout enough, I’d say “It’s a sin to even imagine the world without God” or “There is only one world, so no point imagining anything else”, or “The Bible teaches us that …” But maybe your gentle hand squeezes redirected the blood flow from her brain to other areas.
Anyway, if you decide to go for it, I’m dying to know how it works out!
So am I. I predict a train wreck.
Obvious solution:
Give her all the comments from here (or point her to your post here), saying it’s you (I checked that your past posting offers no other reason for avoiding this). If your influence/friendship/etc with her is not destroyed by the truth, you may carry on.
Dumbest line in your post: “though for some reason she has no problem with consumption of shellfish”
Go back and read Gwern in his experiment. Older posts suggest bias (http://lesswrong.com/lw/bs0/knowledge_value_knowledge_quality_domain/6db0), even ignoring the complete stupidity of the actual result. Gwern’s been here a while. Gwern expresses potential martyrdom for LessWrongian principles (http://lesswrong.com/lw/c5f/case_study_testing_confirmation_bias/6hw2) to approbation, but then is shocked by even the mildest of pushback (http://lesswrong.com/lw/c5f/case_study_testing_confirmation_bias/6i9i), and reasons like an idiot. The legalistic parsing of “quoting” is also moderately disgusting.
Serious question: If Gwern had access to personal info on you in a professional capacity (e.g., private e-mails as Sys Admin or some such), would you trust him not to misuse it? (as you would define “misuse”, and he might not)
TLD, here is my conclusion to your story.
J, after reading this exchange: How could he think that about me? I would never think that way about him. This really hurts (tearing up). Is this really what people think about me?
All truthful, more so than you. Your interaction with J should be humble, perhaps with a bit of self-discovery: http://www.overcomingbias.com/2012/05/what-use-far-truth.html
In any event, as appropriate punishments, I call your behavior Gwernian.
Explicitly declaring “I am going to try to convert you” to any of these people would definitely eliminate or minimize all potential avenues of influence, and I do not think I am nearly subtle enough to work around that. Still, if I understand what you’re saying correctly, it’s more an issue of informed consent of study participants than of letting people decide whether they want their buttons pushed. Is that an accurate understanding of your perspective?
Not really, although it’s a more careful reading than I expected. I think that would be a distinction without a difference. No, as with Gwern, I think the main issue here is you. What sort of person is Gwern training himself to be?
Like Gwern, you act like you’re conducting a study on someone, but it’s really just two people talking. Pretend, for a moment, the other person is actually much smarter than you and conducting a test of the exact same principle you are testing. In Gwern’s case, that leads to a much more interesting interpretation of the incident, since he’s clearly horribly biased (the test really does have a result). In your case, you’re not at all truth-seeking. I would advise you to seek truth in your relationship with J first (either by self-modification or by greater honesty of the unmodified).
Here’s my frivolous question: How old are you and how old is J? (you can make it approximate if you think it would reveal personal info).
Both twenty-one. But that is a less useful statistic than emotional maturity, which I think is what you’re getting at, so I should note that there is a definite discrepancy in terms of how well we handle feelings—I have a great deal more emotional control than does she. So despite being the same age, there is a power imbalance in a sense similar to the one you’re asking about. Of the two undescribed parties, one is older than me (22) and one is younger (19).
Actually, I don’t quite have to pretend that the other parties are attempting manipulation in the other direction; they’ve all been fairly transparent in their attempts (albeit with varying degrees of persistence; of the three, J sits in the middle in terms of time spent attempting to convert me).
No, the pretense is not that they’re trying to manipulate you in the other direction, but that they’re manipulating your manipulation. That is, Gwern was being tested on his fairness as an experimenter of fairness. You are being tested on your truth-seeking as an experimenter in truth-seeking. Of course, you are, just not by J.
I had two reasons for asking about age (you’re right on one). Your narrative sounded pretty juvenile even in its self-description. I was hoping that was true (for both your sakes).
Here’s another game for you to play: Your brain learns whereof you know not. What general rules is it learning as you interact with J? Someday, if you’re lucky enough, you can plan on being quite slow. The virtues you currently rely on (roughly: quick-wittedness) will have left you. You should be investing as quickly as you can in cultivating other personal virtues. Don’t plan on the world changing enough that that can be avoided. I can’t seem to avoid a patronizing attitude (bad sign for me, similarly: I’m out).
Not really. I listed some reasons elsewhere, but they’re pretty arbitrary (which was more or less the point). Also, not sockpuppets in the conventional sense since clearly not disguised and I will never count backwards.
Then please stop; this gives you the power to vote ten times on the same post, and whether or not you use that power, it damages trust in the karma system.
It’s funny to refer to something as a “power” when it’s an extra 10 seconds’ work which anybody could have already engaged in without advertising as blatantly as I have. My advertising has also been false.
The blatant advertising is the problem—openly flouting a social norm weakens it (also, what you’re doing is a cheap way of attracting attention, as opposed to saying worthwhile things).
I’m not sure I agree. I think my behavior, even if treated favorably by the community, will likely not weaken the norm against multi-voting. Karma seems a much less useful signal here than in communities where the prohibitions against “near” behavior are less strict. That’s just from observation, although I think an argument could be made that if a signal really is easy to counterfeit, it’s probably less counterfeited when that fact is generally known (no easy opinion arbitrage). But certainly not worth arguing.
You make an interesting point. To be sure I’ve understood: Behave in a more truth-seeking manner in general, because if I do so I will be a more truth-seeking person in the future from force of habit, and if I do not do so then I will be less of one? If the force of habit is really so potent in cases like this then it’s a very convincing argument; I wouldn’t want to give up the ability to be rational just to be a tiny bit better at manipulation.
Yup. I think “force of habit” undersells it, except to the extent you are a collection of habits. Plus trying to encourage truth-seeking as opposed to truth-labeling as a goal. That is, the phrase you like is “We often say, here, that that which can be destroyed by the truth should be”
But you’re not destroying her belief by the truth, you’re destroying a belief and replacing it with the truth (ish). At least, as you describe yourself. Other stuff (that is, I think this is one of dozens of arguments for why this way of thinking is foolish: more interesting to me is the degree to which the sensible upvoted comments on this page—be nicer and more respectful—lack explication or mechanism).
Absolutely, contingent on being able to convince myself it’s ethical to do so. Give me a moment to do some typing and I’ll outline how I think one such conversation sequence would go.
I don’t personally think the Sequences count as Dark Arts, since I don’t think EY was trying to employ them. At the same time, they were written by someone who very definitely assumed the social role of the wise and informed guru, who used humor, and all sorts of excellent rhetorical principles to make his points as persuasive as possible. If someone were to deliberately use those techniques in order to persuade someone of something because rational reasons wouldn’t work, then I would call it Dark Arts.
This seems like an extremely strong statement and thus hardly believable. Many people would dearly love to discover such powerful secrets. Feel free to share an example or two.
It’s very believable. I’ll give a couple of techniques here.
Reinforce skeptical behavior while modifying their self-image to that of a truth-seeker: “I love talking to you because you pursue the truth over comforting lies”. Be genuine, and by that I mean use the tone you would use to tell someone that their suit looks good.
Give high-status cues. Assume the role of teacher or mentor. Once they want to become more like you, merely expressing your beliefs (not opposing theirs, but expressing yours) will make a significant impact.
Demonstrate that giving up religious belief won’t result in isolation. How exactly you do this will vary based on the social context.
There are a couple others, such as generating low-status associations with religion, which is a bit advanced and so not worth covering here, and creating false memories and comittments, which is scarily easy to do but absolutely dark arts and therefore not covered here.
You will note the absence of “rational argument” on this list. That’s because rational argument is rather ineffective for changing the mind of the person you are arguing with (though it may change the views of observers).
And what is your success rate using these conversion techniques?
3 successes (that I know of) out of 1 attempted. I don’t intentionally deconvert people, generally speaking.
hmm, 300% success rate...
Did you succeed with the one you attempted?
Yes. To be honest, I suspect I only hastened the process by a year or two though (also ended up giving a crash course in evolutionary bio and physics, which helped me understand both subjects much better).
I have recently had the unpleasant experience of getting subjected to the kind of dishonest emotional manipulation that is recommended here. A (former) friend tried to convert me to his religion by using these tricks, and I can attest that they are effective if the person on the receiving end is trusting enough and doesn’t realize that they are being manipulated. In my case the absence and avoidance of rational argument eventually led to the failure of the conversion attempt, but not before I had been inflicted severe emotional distress by a person I used to trust.
Needless to say, I find it unpleasant that these kind of techniques are mentioned without also mentioning that they are indeed manipulative, dishonest and very easy to abuse.
Does LessWrong have an actual primer on the Dark Arts anywhere? There’s a lot of discussion of Defense Against, but I haven’t seen any Practice Of… Perhaps that’s beyond the scope of what we really intend to teach here?
There are several started sequences, none of which got past their fist post.
So any given Practice of the Dark Arts teacher can only last for one term? :)
The last of those 3 (mine), at least, is in the process of being developed. I’m still mostly focusing on reading the relevant literature. I have the rough draft of 3 posts, but since it looks like there will be 10 to 15 of them plus a large post of miscellaneous techniques of influence, I am not posting yet (I will probably reorganize before I post).
I would be interested in knowing what resources you used for this sequence.
As an autist there is a huge swath of innate skills ‘normal’ people possess I can only emulate. Social success for me is indistinguishable from Dark Art skill.
To start with, I would recommend (in the following order)
Thinking Fast and Slow- Kahneman and Tversky Influence: Science and Practice—Cialdini How We Decide- Lehrer How to Win Friends and Influence People—Carnegie Nudge: Thaler and Sunstein
Cialdini and Carnegie have a bad habit of not citing sources, so you may want to take any unsubstantiated claims with a grain of salt.
This list is not comprehensive. If anyone else would like to add some recommendations for books or particularly informative studies, I would definitely appreciate it.
In addition to reading, experience in dealing with people is very important for things like this. If you are not currently employed, I would recommend getting a job in sales. This will give you a chance to practice and experiment in a relatively safe environment. Additionally, I have heard that unusual behavior is more accepted in bars, so that might be worth looking into (I’m under 21 and live in America, so that is not really an option for me. As such, bear in mind that this is secondhand advice). Finally, if you are particularly skilled in some subject area, you may want to consider tutoring. In addition to bringing in money and helping someone else, this will allow you to experience being in a high-status situation.
Once again, the list of recommended experiences is not comprehensive. I would welcome any additional suggestions.
Adding
\s\s
before your\n
will let you do newlines in Markup syntax.Thank you.
Both require powers, the second involves using them unethically.
I look forward very much to seeing your sequence.
Discussion in the comments of this post, in which I perceived Luke as heartily recommending skinny-dipping in sewers for self-improvement purposes. “And then I swallowed this sample of engineered resistant mycobacterium tuberculosis, and I felt great! Not that you should do that or anything.”
“each theist” is the part of the claim that is too strong, since it would include, among others, the Pope, Mother Teresa and Osama bin Ladin. I grant that some techniques do work on some theists (and atheists).
True, if you want to be pedantic about it. In fact, they probably wouldn’t work on most theists in high-status positions. Think about how often you hear about someone “finding God/Allah/Jesus” at a low point of their life when they feel themselves to be failures. Now consider how often someone high-status changes their beliefs.
Didn’t see this! You’re right, that is quite a bit too strong. Let me reduce the strength of that statement: Among theists to whom I have become close enough to ask deeply personal questions and expect truthful answers, such levers seem prevalent.
Really horrifyingly scarily easy. (Most of the comments thread.)
Here are several.
Interesting, the guy must be a very good hypnotist. I’m wondering if he can convert people, as well as deconvert?
It’s my blog. I think I can for the large fraction of atheists that got there by social pressure alone (at least for a month or so), but people that actually understand why atheism is the right answer would be tougher. I’m curious if I could break them too, but that’s way too evil for my tastes.
The techniques don’t cleave down the lines of good and evil epistemically—they cleave down the lines of good and evil instrumentally.
It takes different tools to make someone worse off than it does to help them. If you want to make them better epistemically, then you get to use the fact that having good maps helps you get where you want to be.
Worse off by whose definition? Presumably, if you believed that conversion to Christianity makes one better off, you could use the same techniques (with a different set of arguments) to accomplish the goal.
Both, but the statement is stronger for their definition.
My general approach to helping people is to clear out their fears and then let them reassemble the pieces as they see fit—sometimes suggesting possible solutions. This is more easily used to help people than to hurt them, since they are in full control of their actions and more of the game space is visible to them. I can fool them into thinking they’re helping themselves, but I’d have to include at least selective fear removal (though this can happen accidentally through your own biases!).
In contrast, using leading questions and classical conditioning works equally well regardless of which direction you’re pushing.
Hmm, have been looking through your blog a bit more… I’m wondering if you can help people complaining about akrasia by making their second-order desires first-order ones?
Yep :)
Hmm, you would probably be great playing the jailed AI in an AI boxing experiment (can you beat [someone like] EY?), but how successful would you be playing the guard?
The AI box game still seems stacked against the AI roleplayer for any similar skill level. As the AI, I don’t think I could beat someone like EY or myself on the other end, and as the gate keeper I think I would beat someone like EY or myself.
I still wouldn’t consider myself secure against even human takeover in general, especially if I’m not prepared for mental assault.
Would you know what to look for?
Also, can you write an AI bot that would have a decent success rate against humans, by finding and exploiting the standard wetware bugs?
For the most part
Not for any interesting opponent. I can’t even write a ‘real’ chatbot. The only reason I get the results I do is because I immediately force them into a binary yes/no response and then make sure they keep saying yes .
Something here feels off: I’d call the parent a pretty strong claim, effectively “I can cure akrasia (sorta) in the majority of people who ask”. I would have expected someone to have tested this, and reported their results; if positive, I would have expected this to be the sort of thing I would have noticed much sooner than a year and two months later. (In fact, around the time this was posted, I had started reading LessWrong and I’d received an email entitled “Jim’s hypnotherapy” that I ignored for some months).
Basically, my first reaction to this was “Why ain’t ya rich?”
Having said that, I want to build up the courage to PM you for a test, if you’re still doing so; if you’re half as powerful as you claim, then of course I want to benefit from that. ;p
(I’ve been reading your blog and wound up finding this because I typed “hypnotism” into the LW search box.)
I wouldn’t quite say that. I meant “yes, akrasia is fixable in this way”. Less “I’m a wizard!” and more “Yes, there’s a solution, it looks like that, so have fun solving the puzzle”
To make a personal claim of competence, I’d have to add some qualifiers. Maybe something like “I expect to be able to cure akrasia (sorta) in the majority of people that commit to solving it with me”, which is a much stricter criteria than “asks”. I’d also have to add the caveat that “curing” might, after reflectively equilibriating, end up with them realizing they don’t want to work as hard as they thought—that’s not my call and it wouldn’t surprise me if a significant number of people went that way to some degree.
I’m not sure if you mean within LW in particular. I haven’t yet worked magic on any LWer in this context, but I did offer a couple times.
If you’re counting outside LW, hypnotherapists get results all the time—even “amazing” results. Some people are convinced, some people write it off in one way or another. It doesn’t surprise me all that much given how people get with “skepticism” and not wanting to be made fool about hypnotism.
Good question.
The first part of the answer is that I have gotten a ton of value out of these skills, and only expect to gain more.
The second part is that it’s not magic. It’s more of a martial art than a cheat code. Even when it appears to be magic, there’s usually more going on in the background that made it possible. The toughest part is all the meta-level bullshit that people carry around about their problems which makes getting them into “lets solve this” mode the hard part. Once you get someone to congruently say “Yes, I’m going to be a good hypnotic subject and we’re going to fix this”, you’ve done 90% of the work—but everyone focuses on the last 10% which looks like magic and then wonders “why not sprinkle this magic pixie dust on everyone!?!”.
Also, getting “rich”—assuming you mean at a level more than charging a couple hundred dollars per hour like many hypnotherapists do—requires you to be good at finding high leverage applications and working your way into them. That’s a whole new subset of skills and I haven’t yet gotten to that stage—though I plan on working on it.
First of all, I don’t like this “if you’re half as powerful as you claim” thing—especially since you seem to have read it as stronger than intended. When I make “strong claims” I do not expect, in the social obligation sense, to be believed. I’m just trying to be understood—that really is how I honestly see things. Take with as much salt as you please.
It’s important to make this explicit because setting up high expectations for a hoping skeptic is a sure way to fail—It sets up a dynamic of me being responsible for their behavior. While I do take responsibility for their actions internally, the only real way I can do this is through making sure they take responsibility for their own actions.
Also, there should be no courage needed. I can’t just take over your mind. With you, I’m not sure whether I’d pull out hypnosis at all, and (almost) certainly can’t just get into it from the start. Also, I can only push you as hard as you let yourself be pushed. Let’s chat some time and see where it goes.
Your link to your blog is down, but once its back up and if I find this claim plausible upon reading it, I would be very interested in trying this on myself.
EDIT: read the blog, and it looks awesome.
You’re welcome to try and break my atheism, but I’m saying that only because I’m reasonably darned sure you can’t do that by any conversational means (so long as we’re actually in a universe that doesn’t have a God, of course, I’m not stating a blind belief known to me to be blind).
Edit: oh, wait, didn’t realize you were using actual hypnotism rather than conversation. Permission retracted; I don’t know enough about how that works.
Agreed. The only way I’d see myself as having a fighting chance would be if you had a strong reason to go into hypnosis and you didn’t know my intentions.
If the world really were at stake, I think I could help you with the red panda problem—though I still have fairly wide confidence intervals on how difficult that would be because I haven’t tried something like this. I have yet to find a real life example where I’d encourage self deception and a surprisingly large fraction of problems go away when you remove the self deception.
I have been having a lot of fun using hypnosis and techniques inspired by hypnosis to improve rationality—and successfully. I was a bit disappointed that you didn’t respond to my email offering to show what hypnosis says about training rationality. And now I’m a bit confused with the retraction because I had figured you had completely written me off as a crackpot.
Will Ryan mentioned that you were skeptical of “this stuff”. Can you elaborate on what specifically you’re skeptical about and what kinds of evidence you’d like to see?
I hope you don’t think you are actually “giving amnesia” or doing anything other than roleplaying mind-controller and mind-controllee, in dialogues like these. Those teenagers are just playing along for their own reasons.
That hypothesis certainly isn’t new to me.
There’s a lot of research on hypnotic amnesia. Here are a few showing differences between hypnotically suggested amnesia and faked amnesia.
http://psycnet.apa.org/journals/abn/70/2/123/
http://www.ncbi.nlm.nih.gov/pubmed/2348012
http://psycnet.apa.org/journals/abn/105/3/381/
The relationship between “actually giving amnesia” and “roleplaying amnesia” is fascinating, but not something I’m going to get into here.
I certainly don’t mean to say that I have any kind of fully-general way to convert theists. I mean rather to say that as you get closer to individual people, you find out what particular levers they have to flip and buttons they have to push, and that with sufficient familiarity the sequence of just-the-right-things-to-say-and-do becomes clear. But if you would like an example of what I’d say to a specific person (currently there are three to whom I know what I would say), I can do that. Let me know.
Yes, this sounds very intriguing. So, you have a model of their thinking good enough to predict how such conversation would go? Would you be willing to describe it here and then try it IRL (if you deem it appropriate) and report what happened?
I’m going to describe such a conversation (the first of what would, I think, be many) for a girl who I will call Jane, though that is not her name. Some background: Jane is a devout Catholic, an altar girl, a theology major, a performer of the singing-acting-dancing type, and one of the bubbliest people I know. She is also firmly against gay marriage, abortion, premarital sex, and consumption of alcohol or other drugs (though for some reason she has no problem with consumption of shellfish). You may have read the previous two sentences and thought “there’s a lot of sexual repression going on there” and you would be quite correct, though she would never admit that. Here is what I would say and do. Don’t take the wording too literally; I’m not that good.
tld: (At an appropriate moment) Jane, I have a very personal question for you.
J: Okay, shoot.
tld: It’s about God.
J: Oh dear. I’m listening.
tld: So God exists. And he’s up there, somewhere, shouting down that he loves us. But if tomorrow morning he suddenly vanished—just ceased to exist, packed up and left town, whatever—would you want to know?
J: I—uh—gosh. That would go against everything God’s said, about how he would never abandon us- tld: I know. But just think of it as a counterfactual question. God leaves, or vanishes. Do you want to know? J: I don’t know. It’s—I just can’t imagine that happening.
tld: taking Jane’s hand, gentle smile Hey. Don’t let it rattle you. Just remember, here in the real world, God’s up there somewhere, and he loves us, and he would never abandon us.
J: I love hearing you say that.
tld: Sure. So in the real world, nothing to worry about. But over there in the imaginary, fake world—God vanishes. Would you want to know?
J: Well… I guess so. Because otherwise it’s just living a lie, isn’t it?
tld: Right. squeeze hand softly I’m glad you agree, it’s very brave and honest of you to be able to say that. So the follow-up question is, what would change, in that world?
J: What do you mean?
tld: Well, God was there, and now he’s left that world behind. So it’s a world without God—what changes, what would be different about the world if God weren’t in it?
J: I can’t imagine a world without God in it.
tld: Well, let’s look at it the other way, then. Let’s imagine another world, just like the first two except that it never had a God in the first place, and then God shows up. He came from the other world, the first one we imagined, to give this new world some of His light, right? reassuring squeeze
J: squeeze back Okay...
tld: So God comes into this new world, and the first thing he does is make it a better place, right? That’s what God does, he makes the world a better place.
J: Yeah! Yeah, exactly. God makes the world a better place.
tld: So God comes down himself, or sends down His son, and feeds the poor and heals the sick, and pretty soon the world is better off because God is there.
J: Of course.
tld: Great! smile So let’s think about the other world, the one that got left behind, for a second. What would you do, if you were there?
J: What? (shocked)
tld: Well, the you in the other world finds out there’s no God anymore, and that’s that. So what would you do? lean in, squeeze hand again There must be some things you’d dare to do that you wouldn’t otherwise.
J: pause, blush Um. Well. I don’t know. I’d have to think about it.
tld: Right, it’s a hard question. final hand squeeze, lean back But I hope you’ll think about it, for the next time we talk, and let me know what you’ve come up with. I’ve actually got to run, it’s getting kind of late (or other excuse for why I need to leave, etc)
Proceed to wait until she brings the subject up again, or bring it up again later myself.
So, yes. The above conversation has two purposes, which are (a) to plant the idea of dealing with a world where God doesn’t exist, and (b) to remind Jane that there are things she wants but can’t have because of her faith so that she has a reason, though unspoken, to want to be rid of it; there are a couple of other things going on as well which I’m sure faul_sname will cringe at, but that’s the gist. Intended arc of development: A few months’ worth of working on a truth-seeking mindset, possibly more work on building rapport and position-of-authority mojo, and eventually the Jenga moment, which it’s difficult to plan out precisely in advance. And yes, I realize that playing on sexual tension to manipulate someone’s beliefs is, in a word, disgusting. I did say Dark Arts for a reason.
The other two people who’ve been weighing on my mind are let’s-call-him-James and let’s-call-her-Mary, for whom the intended sequence is a little different (neither of them has an easily-accessible repressed-sexuality motivator) but you get the idea, I think.
This.. reads to me like a Chick tract more than anything else. I just don’t believe J will be that easy to manipulate.
What’s unreasonable about Chick tracts, I think, is that strangers can’t really walk up and manipulate you like that unless you’re already in an extremely emotionally vulnerable state. It’s easier if there’s an established relationship.
Unless J is much, much less intelligent than you, or you’ve spent a lot of time planning different scenarios, it seems like any one of J’s answers might well require too much thought for a quick response. For example,
Lots of theists might answer this in a much more specific fashion. “Well, I suppose the world would cease to exist, wouldn’t it?”, “Anything could happen, since God wouldn’t be holding it together anymore!”, or “People would all turn evil immediately, since God is the source of conscience.” all seem like plausible responses. “I can’t imagine a world without God in it” might literally be true, but even if it is, J’s response might be something entirely different, or even something that isn’t really even a response to the question (try writing down a real-life conversation some time, without cleaning it up into what was really meant. People you know probably very often say things that are both surprising and utterly pointless).
I didn’t even go to Catholic school, but in the process of Confirmation I learned enough apologetics to deflect or reject or just willfully not understand most of these.
A Good Catholic will tell you that the universe could not exist without God, and/or that nothing good can exist without God, so if there were no God, there would either be no universe, or the universe would be hell.
It would sort of be like me trying to convince you quantum physics is wrong and starting out by saying, “Imagine a world without quantum physics.” You have nothing with which to substitute quantum physics. Your mind returns a divide by zero error.
Additionally, religious folks in general tend to claim to believe that morality comes from God. And when they say this, they really truly mean that if there were no God, there would be no morality. That the fact that morality exists is a kind of proof that God exists. I am not making this up. I have been told by a religious person that, if they were to learn that God did not exist, they would immediately embark upon an orgy of murder and theft, because, “There would be no reason not to.” They believe this about themselves despite the fact that we know it to be a misunderstanding of psychology. I am not saying all religious people have exactly this glitch, but I am trying to emphasize that your friend(s) probably don’t have the cognitive algorithms in place to even comprehend these questions the way you mean them.
With respect to the fact that for most of its history humanity didn’t know about quantum physics, as well as for larger part of my life I didn’t know anything substantial about quantum physics without suffering any serious injury to my imagination, this would be quite easy.
Just a nitpick, I mostly agree with the rest of your comment.
To avoid a typical mind fallacy, let’s say that some people really have no non-supernatural reason to avoid murder and theft. But they are in a minority, so there is a high prior probability that the given religious person does not belong there.
However, I would love to know that for the given nonzero subset of humanity that has no non-supernatural reason to avoid murder and theft, how effective religion really is at stopping them.
No, this is a perfect example of belief in belief without actual belief.
Wow. You’re, like, literally the Devil.
I mean that in a nonjudgmental way.
When I first read this, I thought “I could do that, that’s easy! … if I had no ethics whatsoever and didn’t care about true from false.”
Just out of curiosity, do you have the obvious ulterior motive here?
Yes. Which is a very good reason for me not to trust my inclinations.
I certainly wouldn’t be nearly as ethical in your place.
Just call me le Chevalier mal Fet.
Do you get his Noble Phantasm? “Knight of Honor” is potentially one of the most powerful hougu in Fate/Zero.
I… Was not even aware that such a game existed; I was referring to The Once And Future King. But clicking through the wiki a little bit has me fascinated by the tangle of mythological references.
I have two words for this: planning fallacy.
This is a very valid point, but I’m less interested in whether such a plan is practical than in whether, assuming feasibility, it is ethical.
That’s pretty good. Of course, there are a few places in this conversation where Jane might deviate from the script, but you know her and I don’t. Were I devout enough, I’d say “It’s a sin to even imagine the world without God” or “There is only one world, so no point imagining anything else”, or “The Bible teaches us that …” But maybe your gentle hand squeezes redirected the blood flow from her brain to other areas.
Anyway, if you decide to go for it, I’m dying to know how it works out!
So am I. I predict a train wreck.
Obvious solution:
Give her all the comments from here (or point her to your post here), saying it’s you (I checked that your past posting offers no other reason for avoiding this). If your influence/friendship/etc with her is not destroyed by the truth, you may carry on.
Dumbest line in your post: “though for some reason she has no problem with consumption of shellfish”
Go back and read Gwern in his experiment. Older posts suggest bias (http://lesswrong.com/lw/bs0/knowledge_value_knowledge_quality_domain/6db0), even ignoring the complete stupidity of the actual result. Gwern’s been here a while. Gwern expresses potential martyrdom for LessWrongian principles (http://lesswrong.com/lw/c5f/case_study_testing_confirmation_bias/6hw2) to approbation, but then is shocked by even the mildest of pushback (http://lesswrong.com/lw/c5f/case_study_testing_confirmation_bias/6i9i), and reasons like an idiot. The legalistic parsing of “quoting” is also moderately disgusting.
Serious question: If Gwern had access to personal info on you in a professional capacity (e.g., private e-mails as Sys Admin or some such), would you trust him not to misuse it? (as you would define “misuse”, and he might not)
TLD, here is my conclusion to your story.
J, after reading this exchange: How could he think that about me? I would never think that way about him. This really hurts (tearing up). Is this really what people think about me?
All truthful, more so than you. Your interaction with J should be humble, perhaps with a bit of self-discovery: http://www.overcomingbias.com/2012/05/what-use-far-truth.html
In any event, as appropriate punishments, I call your behavior Gwernian.
Explicitly declaring “I am going to try to convert you” to any of these people would definitely eliminate or minimize all potential avenues of influence, and I do not think I am nearly subtle enough to work around that. Still, if I understand what you’re saying correctly, it’s more an issue of informed consent of study participants than of letting people decide whether they want their buttons pushed. Is that an accurate understanding of your perspective?
Not really, although it’s a more careful reading than I expected. I think that would be a distinction without a difference. No, as with Gwern, I think the main issue here is you. What sort of person is Gwern training himself to be?
Like Gwern, you act like you’re conducting a study on someone, but it’s really just two people talking. Pretend, for a moment, that the other person is actually much smarter than you and conducting a test of the exact same principle you are testing. In Gwern’s case, that leads to a much more interesting interpretation of the incident, since he’s clearly horribly biased (the test really does have a result). In your case, you’re not at all truth-seeking. I would advise you to seek truth in your relationship with J first (either by self-modification or greater honesty of the unmodified).
Here’s my frivolous question: How old are you and how old is J? (you can make it approximate if you think it would reveal personal info).
Both twenty-one. But that is a less useful statistic than emotional maturity, which I think is what you’re getting at, so I should note that there is a definite discrepancy in terms of how well we handle feelings—I have a great deal more emotional control than does she. So despite being the same age, there is a power imbalance in a sense similar to the one you’re asking about. Of the two undescribed parties, one is older than me (22) and one is younger (19).
Actually, I don’t quite have to pretend that the other parties are attempting manipulation in the other direction; they’ve all been fairly transparent in their attempts (albeit with varying degrees of persistence; of the three, J sits in the middle in terms of time spent attempting to convert me).
No, the pretense is not that they’re trying to manipulate you in the other direction, but that they’re manipulating your manipulation. That is, Gwern was being tested on his fairness as an experimenter of fairness. You are being tested on your truth-seeking as an experimenter in truth-seeking. Of course, you are, just not by J.
I had two reasons for asking about age (you’re right on one). Your narrative sounded pretty juvenile even in its self-description. I was hoping that was true (for both your sakes).
Here’s another game for you to play: Your brain learns whereof you know not. What general rules is it learning as you interact with J? Someday, if you’re lucky enough, you can plan on being quite slow. The virtues you currently rely on (roughly: quick wits) will have left you. You should be investing as quickly as you can in cultivating other personal virtues. Don’t plan on the world changing enough that that can be avoided. I can’t seem to avoid a patronizing attitude (bad sign for me; similarly: I’m out).
Is there a reason you’re spawning a horde of sockpuppets?
Not really. I listed some reasons elsewhere, but they’re pretty arbitrary (which was more or less the point). Also, not sockpuppets in the conventional sense since clearly not disguised and I will never count backwards.
Then please stop; this gives you the power to vote ten times on the same post, and whether or not you use that power, it damages trust in the karma system.
It’s funny to refer to something as a “power” when it’s an extra 10 seconds’ work which anybody could have already engaged in without advertising as blatantly as I have. My advertising has also been false.
The blatant advertising is the problem—openly flouting a social norm weakens it (also, what you’re doing is a cheap way of attracting attention, as opposed to saying worthwhile things).
I’m not sure I agree. I think my behavior, even if treated favorably by the community, will likely not weaken the norm against multi-voting. Karma seems a much less useful signal here than in communities where the prohibitions against “near” behavior are less strict. That’s just from observation, although I think an argument could be made that if a signal really is easy to counterfeit, it’s probably less counterfeited when that fact is generally known (no easy opinion arbitrage). But certainly not worth arguing.
You make an interesting point. To be sure I’ve understood: Behave in a more truth-seeking manner in general, because if I do so I will be a more truth-seeking person in the future from force of habit, and if I do not do so then I will be less of one? If the force of habit is really so potent in cases like this then it’s a very convincing argument; I wouldn’t want to give up the ability to be rational just to be a tiny bit better at manipulation.
Yup. I think “force of habit” undersells it, except to the extent that you are a collection of habits. Plus trying to encourage truth-seeking as opposed to truth-labeling as a goal. That is, the phrase you like is “We often say, here, that that which can be destroyed by the truth should be.”
But you’re not destroying her belief by the truth; you’re destroying a belief and replacing it with the truth (ish). At least, as you describe yourself. Other stuff: I think this is one of dozens of arguments for why this way of thinking is foolish. More interesting to me is the degree to which the sensible upvoted comments on this page (be nicer and more respectful) lack explication or mechanism.
Okay. Thank you very much for your insight; I do appreciate it.
Absolutely, contingent on being able to convince myself it’s ethical to do so. Give me a moment to do some typing and I’ll outline how I think one such conversation sequence would go.
I just caught myself rationalizing ways to prove that deconverting them would be the right thing, so that I could see the results of this experiment.
I caught myself doing more or less the same thing (but for substantially eviller reasons), which is why I asked LW in the first place.
thelittledoctor is making no small claim here, but such sequences of levers do exist and have dispelled several people’s faith.
Does your link to the Sequences imply that you consider them all Dark Arts?
I don’t personally think the Sequences count as Dark Arts, since I don’t think EY was trying to employ them. At the same time, they were written by someone who very definitely assumed the social role of the wise and informed guru, who used humor, and all sorts of excellent rhetorical principles to make his points as persuasive as possible. If someone were to deliberately use those techniques in order to persuade someone of something because rational reasons wouldn’t work, then I would call it Dark Arts.