why would anyone want to avoid employing the dark arts as a general rule?
If one can’t, or doesn’t want to, avoid the Dark Arts in some circumstances, then I’m clearly asking whether the people on LW should or can discuss and study their use for this application.
Yes! Social skills are healthy. :)
Are you saying social skills are the same as the dark arts?
I am alluding to a distinct overlap between social skills and that which is labelled the ‘dark arts’. This is particularly the case when instinctive and emotion-driven behaviors have been raised to the level of self-awareness.
(I incidentally note that the ‘dark arts’ include features that are more or less necessary for healthy human interaction. “Dark Arts” are often beneficial for the person you are interacting with, and sometimes expected as a courtesy.)
Can you give examples, either of emotion-driven behaviors becoming dark arts when raised to awareness, or of dark arts being necessary to healthy interaction? I think we are using different definitions of what the dark arts are.
On the wiki the dark arts are defined as exploiting the biases of others so that they behave irrationally. This is morally wrong—I want others not only to accept the right answers, but to accept them for the right reasons.
Also, the dark arts are, well, dark.
If you’re classifying the intentional use of human biases as wrong in a terminal moral sense, there’s not much more to be said other than that I don’t share your moral values, not even when you format them in italics.
If you’re instead claiming they are wrong in some instrumental sense—that is, that they lead to bad results—I’d like to understand how you derive that.
In other words: suppose I want to convince people to do something, or to stop doing something, or to feel a certain way or stop feeling a certain way, or some other X. Suppose I then convince people to X by using the “dark arts” and “exploiting the biases of others.”
For example, suppose I want someone to think that making use of human biases is a bad thing, and so I label that activity using words with negatively weighted denotations like “exploit” and “dark.”
What have I made worse, by so doing?
Let’s say I’m working with Bob. By exploiting his cognitive biases, I can convince him to do two things that I value. Without such exploitation, I can only convince him to do one. If I do exploit his biases, these bad things happen:
I have less confidence that either of the two things were actually worthwhile.
It is more likely that my enemy will be able to convince Bob to undo the valuable things he did.
I have less trust in Bob in the future, and his total value to me is reduced.
In some cases these effects might outweigh the value of getting two things done rather than one.
Nobody doubts that doing stupid or ill-considered things with the dark arts could have undesirable consequences.
Note: the parent is another example of a dark arts persuasion technique.
I think your problem is that you have too broad a notion of what constitutes “dark arts”.
I don’t accept disagreement with Eugine_Nier as a ‘problem’.
There is a time and a place for each of the following:
“Grey” arts.
Dark arts as defined on the wiki.
The alternate version of ‘dark arts’ that nerzhin presented.
Further, there are instances in each category where the use of dark arts is pro-social. It seems that the term ‘dark arts’ has become a hindrance to understanding instead of a help. It does not mean evil!
I agree that manipulating Bob makes it hard to rely on Bob for “sanity checks” of my motives, and that that’s a significant loss if Bob would otherwise have been useful in that capacity.
And I can sorta see how it might be true in some cases that manipulating Bob might render him more manipulable by others, and therefore less valuable to me, than he would have been had I not manipulated him. (I have trouble coming up with a non-contrived example, though, so I’m not convinced.)
So, yes, agreed: in cases like those, it makes things worse.
For example, suppose I want someone to think that making use of human biases is a bad thing, and so I label that activity using words with negatively weighted denotations like “exploit” and “dark.”
What have I made worse, by so doing?
Now that was well done. Although technically you would have counterfactually made the universe worse according to your values. You will have lost a small measure of respect from your audience and, by so doing, reduced your social influence—a critical instrumental resource. Even worse, your credibility will have decreased, most specifically when it comes to moral authority.
(nods) Agreed, given that my audience is such that I lose more respect/influence/authority than I gain. Which some audiences are.
For example, suppose I want someone to think that making use of human biases is a bad thing, and so I label that activity using words with negatively weighted denotations like “exploit” and “dark.”
What have I made worse, by so doing?
You seem to be confusing rationality with suppression of emotion. As Eliezer points out here, there is nothing wrong with feeling emotions that accurately correspond to the territory. There is similarly nothing wrong with promoting such emotions in others. The “dark arts” really are things humans should avoid using; as such, there is nothing wrong with associating them with negative emotions.
I’m not sure where you got that from, at all. I am talking neither about rationality nor emotion.
I am talking about “exploiting the biases of others so that they behave irrationally”—in this example, influencing someone’s decision about whether or not to do something based neither on the consequences of doing it nor their pre-existing deontological commitments, but rather on the connotations of the metaphorical language I’ve chosen to describe it with.
That those connotations are emotional is incidental.
The description on the wiki does put a negative spin on it (although not quite as negative as yours—behavior is not even mentioned). From his description, I get the impression that Konkvistador is including ‘grey arts’ too.
This is morally wrong
I reject this moral proscription, and any other moral proscription that would make becoming more rational a form of self-sabotage.
Given that you suck at dark arts, as demonstrated by the fact that you openly admit on a public forum that you’re willing to use them, I don’t see how this moral proscription is a form of self-sabotage.
Your attitude is objectionable and your understanding of signalling strategy lacks nuance.
Are you sure the beliefs you’re using dark arts to promote are correct? If a belief you’re promoting turns out to be wrong, it’ll be nearly impossible to backpedal. Read this for a more detailed description.
That strikes me as more of an excuse for saying that avoiding the dark arts is the desirable thing to do than an actual reason.
As Eliezer writes here:
If natural selection, which doesn’t care at all about the welfare of unrelated strangers, still manages to give you a sense of ethical unease on account of transgressive plans not always going as planned—then how much more reluctant should you be to rob banks for a good cause, if you aspire to actually help and protect others?
But seriously, read the whole article.
Full understanding (and frequent implementation) of the concept of ethical inhibition does not lead me to accept naive signalling beliefs indiscriminately.