This article is awesome! I’ve been doing this kind of stuff for years with regard to motivation, attitudes, and even religious belief. I’ve used the terminology of “virtualisation” to talk about my thought-processes/thought-rituals in carefully defined compartments that give me access to emotions, attitudes, skills, etc. that I would otherwise find difficult. I even have a mental framework I call “metaphor ascendance” to convert false beliefs into virtualised compartments so that they can be carefully dismantled without loss of existing utility. It’s been nearly impossible to explain to other people how I do and think about this, though often you can show them how to do it without explaining. And for me the major in-road was definitely the realisation that there exist tasks which are only possible if you believe they are—guess I’ll have to check out The Phantom Tollbooth (I’ve never read it).
This might be a bit of a personal question (feel free to PM or ignore), but have you by any chance done this with religious beliefs? I felt like I got a hint of that between the lines, and it would be amazing to find someone else who does this. I’ve come across so many people in my life who threw away a lot of utility when they left religion, never realising how much of it they could keep or convert without sacrificing their integrity. One friend even teasingly calls me the “atheist Jesus” because of how much utility I pumped back into his life just by leveraging his personal religious past. Religion has been under strong selective pressure for a long time, and has accumulated a crapload of algorithmic optimisations that can easily get tossed by its apostates just because they’re described in terms of false beliefs. My line is always, “I would never exterminate a nuisance species without first sequencing its DNA.” You just have to remember that asking the organism about its own DNA is a silly strategy.
Anyways, I could go on for a long time about this, but this article has given me the language to set up a new series I’ve been trying to rework for Less Wrong, along these lines, so I’d better get cracking. But the buzz of finding someone like-minded is an awesome bonus. Thank you so much for posting.
p.s. I have to agree with various other commenters that I wouldn’t use the “dark arts” description myself—mind optimisation is at the heart of legit rationality. But I see how it definitely makes for useful marketing language, so I won’t give you too much of a hard time for it.
To address your other question: I was raised religious, and I learned about compartmentalization by self-observation (my religion was compartmentalized for a couple years before I noticed what I was doing). That said, since becoming an atheist I have never held a compartmentalized religious belief for motivational purposes or otherwise.
To address your postscript: “Dark Arts” was not supposed to mean “bad” or “irrational”, it was supposed to mean “counter-intuitive, surface-level irrational, perhaps costly, but worth the price”.
Strategically manipulating terminal goals and intentionally cultivating false beliefs (with cognitive dissonance as the price) seem to fall pretty squarely in this category. I’m honestly not sure what else people were expecting. Perhaps you could give me an idea of things that squarely qualify as “dark arts” under your definition?
(At a guess, I suppose heavily leveraging taboo tradeoffs and consequentialism may seem “darker” to the layman.)
How about extending the metaphor and calling these techniques “Rituals” (they require a sacrifice, and even though it’s not as “permanent” as in HPMOR, it’s usually dangerous), reserving “Dark” for the arguably-immoral stuff?
“Dark Arts” was not supposed to mean “bad” or “irrational”, it was supposed to mean “counter-intuitive, surface-level irrational, perhaps costly, but worth the price”.
In my understanding, “Dark Arts” means, basically, using deceit. In the social context that implies manipulating others for your own purposes. I don’t think “Dark Arts” is a useful term in the context of self-motivation.
Huh. Personally, I feel that the need for comments such as this one strongly indicates “Dark” subject matter. It’s interesting to get a different perspective. Thanks!
I’m honestly not sure what else people were expecting. Perhaps you could give me an idea of things that squarely qualify as “dark arts” under your definition?
Techniques that have negative side effects for other people. Economic models that recommend that people defect more because it’s in their self-interest are “dark”.
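To make the “defect because it pays” logic concrete: in a one-shot Prisoner’s Dilemma, defection maximises individual payoff no matter what the other player does, even though both players would be better off under mutual cooperation. A minimal sketch, using the standard textbook payoff values (these numbers are the usual convention, not anything from the comment above):

```python
# One-shot Prisoner's Dilemma with the conventional payoff ordering
# T > R > P > S (temptation > reward > punishment > sucker's payoff).
PAYOFF = {  # (my move, their move) -> my payoff
    ("defect", "cooperate"): 5,     # T: temptation to defect
    ("cooperate", "cooperate"): 3,  # R: reward for mutual cooperation
    ("defect", "defect"): 1,        # P: punishment for mutual defection
    ("cooperate", "defect"): 0,     # S: sucker's payoff
}

def best_response(their_move):
    """Return the move that maximises my payoff against a fixed opponent move."""
    return max(("cooperate", "defect"), key=lambda m: PAYOFF[(m, their_move)])

# Defection is a dominant strategy: it is the best response either way,
# even though mutual cooperation (3, 3) beats mutual defection (1, 1).
assert best_response("cooperate") == "defect"
assert best_response("defect") == "defect"
```

This is the sense in which a model that recommends defection has negative side effects for others: each individually “rational” choice pushes the pair toward the (1, 1) outcome.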
Ah well, I had to ask. I know religion is usually the “other team” for us, so I hope I didn’t push any buttons by asking—definitely not my intention.