Rather, the idea is that by appealing to both the “head” and the “heart” we can convey a fuller message about EA, and that this will amplify our reach among people who otherwise might not know about it or take it seriously.
Codswallop. This post is talking about dark arts, about bypassing the head entirely. “Superdonor” indeed.
I agree that OP was leaning a bit heavy on the advertising methods, and that advertising is almost 100% appeal to emotion. However, I’m not sure that 0% emotional content is quite right either. (For reasons besides argument to moderation.) Occasionally it is necessary to ground things in emotion, to some degree. If I were to argue that dust specks in 3^^^3 people’s eyes amount to a huge quantity of suffering, I’d likely wind up appealing to empathy for that vast, unfathomable amount of suffering. The argument relies almost exclusively on logic, but the emotional content drives the point home.
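(For a sense of scale: 3^^^3 is Knuth’s up-arrow notation. A minimal Python sketch, purely illustrative, with a function name of my own choosing:)

```python
def up_arrow(a, n, b):
    """Knuth's up-arrow a (n arrows) b: one arrow is exponentiation;
    each additional arrow iterates the previous operation."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# 3^^3 = 3^(3^3) = 3^27 = 7,625,597,484,987 -- still printable:
print(up_arrow(3, 2, 3))

# 3^^^3 = 3^^(3^^3) is a power tower of 3s about 7.6 trillion levels
# tall; don't call up_arrow(3, 3, 3) -- no physical computer can hold it.
```

The number is beyond any intuitive grasp, which is exactly why the argument leans on empathy to make the magnitude felt.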
However, maybe a more concrete example of the sorts of methods EAs might employ will make it clearer whether or not they are a good idea. If we do decide to use some emotional content, this seems to be an effective science-based way to do it: http://blog.ncase.me/the-science-of-social-change/
Aside from just outlining some methods, the author deals briefly with the ethics. They note that children who read George Washington’s Cherry Tree were inspired to be more truthful, while the threats implicit in Pinocchio and The Boy Who Cried Wolf didn’t motivate them to lie less than the control group did. I have no moral problem with showing someone a good role model and setting a good example, even if that evokes emotions which influence their decisions. That’s still similar to an appeal to emotion, although the Aristotelian scheme the author mentions would classify it as Ethos rather than Pathos. I’m not sure I’d classify it under Dark Arts. (This feels like it could quickly turn into a confusing mess of different definitions for terms. My only claim is that this is a counterexample, where a small non-rational component of a message seems to be permissible.)
It seems worth noting that EAs are already doing this, to some degree. Here are a few EA and LW superheroes, off the top of my head:
Norman Borlaug saved perhaps a billion lives from starvation by making sweeping improvements to crop yields through industrial agriculture. https://80000hours.org/2011/11/high-impact-science/
Viktor Zhdanov convinced the World Health Assembly, by a margin of only 2 votes, to eradicate smallpox, saving perhaps hundreds of millions of lives. https://80000hours.org/2012/02/in-praise-of-viktor-zhdanov/
Stanislav Petrov Day was celebrated here a bit over a month ago, although there are others who arguably averted closer Cold War near-misses, just on dates less convenient to make a holiday of.
One could argue that we should discuss these sorts of people only for how their stories inform the present. However, if their stories also have an aspirational impact, then it seems reasonable to share that. I’d have a big problem if EA turned into a click-maximizing advertising campaign or launched infomercials; I agree with you there, and there are some techniques which we definitely shouldn’t employ. But some methods besides pure reason legitimately do seem advisable. Guilting someone out of pocket change is significantly different from acquiring new members by encouraging them to aspire to something, and then giving them the tools to work toward that common goal. It’s not all framing.
The issue with advertising isn’t just the ethics. Set the ethics aside. The issue is that you’re bringing people in on the basis of something other than Effective Altruism.
How many new people could EA successfully acculturate each month? Because that’s the maximum number of people you should be reaching each month. EA is fundamentally a rationalist culture; if you introduce non-rationalists faster than you can teach them rationalism, you are destroying your own culture. How do you foresee this going for EA’s culture?
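(To make the arithmetic behind that worry concrete, here is a toy sketch in Python; the numbers and the `core_fraction` name are my own illustration, not figures from EA:)

```python
# Toy dilution model: each month `newcomers` people join, but the
# community can only acculturate `capacity` of them; the rest never
# absorb the culture.
def core_fraction(members=1000, newcomers=100, capacity=30, months=24):
    core, total = members, members
    for _ in range(months):
        total += newcomers
        core += min(newcomers, capacity)  # acculturation is the bottleneck
    return core / total

print(f"{core_fraction():.0%}")  # ~51%: half the movement is unacculturated within two years
```

With inflow above acculturation capacity, the core fraction falls every month; cap the inflow at capacity and it holds steady.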
I very much agree. My post was leaning more toward the heart to go against the mainstream. There are plenty of tactics I would not endorse. We shouldn’t lie to people, or tell them that Jesus will cry if they don’t give to effective charities. However, I think it’s important to acknowledge and be OK with using some moderate dark arts to promote rationality and effective altruism. If we can motivate people to engage with the EA movement and put their money toward effective charities by getting them to truly care about effective donations, I think that is a quite justifiable use of moderate dark arts. We can be agentive about meeting our goals.
P.S. Nice username!
This post is talking about dark arts, about bypassing the head entirely.
Guess what: that’s what the ‘heart’ responds to. It doesn’t mean you can’t appeal to the head too; the point is just that a mixed message won’t work very well. The appeals do have to be largely distinct, though they would probably work best if presented together.
Emotions are not in opposition to rationality, and you do not have to bypass rational processes in order to reach the heart. That is the flawed presumption underlying the Spock mentality.
Emotions are not in opposition to rationality, and you do not have to bypass rational processes in order to reach the heart.
This is very true. You only need to bypass the head if you want to sabotage the rational process and manipulate people into something their head would have rejected.
The Spock mentality is about personal decision-making, not communication or even influence. The notion that ‘reaching’ System 1 is not something you can do with ordinary, factual communication is quite widely accepted. Even some recent CFAR materials, with their goal-factoring approach, are clearly based on this principle.
I think I was clear that we should still use the current EA tactics of appealing to the head, but enrich them by appealing to the heart, to emotions. I think it’s important to acknowledge and be OK with using some moderate dark arts to promote rationality and effective altruism. If we can motivate people to engage with the EA movement and put their money toward effective charities by getting them to truly care about effective donations, I think that is a quite justifiable use of moderate dark arts.
First, emotions != dark arts, or EA would be meaningless as an enterprise.
Second, you’re not getting anybody to care about effective donations, you’re getting them to care about the social status they would attain by being a part of your organization. People who care about social status in this way are going to want more, and they’re better at it than you are. You will lose control.
Sure, emotions = dark arts, but there are shades of darkness; I think we can all agree on that. For example, the statement “emotions != dark arts” relies on a certain emotional tonality to the word “dark arts.”
I’m getting people to care about social status to the extent that they care about effective donations. There is nothing about Intentional Insights itself that they should care about; the organization is just a tool to get them to care about effective giving. The key is to tie people’s caring to effective giving :-)
Sure, emotions = dark arts, but there are shades of darkness; I think we can all agree on that.
No. We can’t. Emotions != dark arts. I say that as somebody who killed his emotions and experienced an emotion-free existence for over a decade in the pursuit of pure rationality. You have no idea what you’re talking about.
the statement “emotions != dark arts” relies on a certain emotional tonality to the word “dark arts.”
No, it does not. It is a statement that there are ways of interacting with emotions that are non-manipulative. Emotions are not in opposition to rationality, and indeed are necessary to it. Emotions are the fundamental drive to our purpose; rationality is fundamentally instrumental. Emotions tell us what we should achieve; rationality tells us how. What makes your approach “dark arts” is that you seek to make people pursue something other than the goal you are appealing to in them.
I’m getting people to care about social status to the extent that they care about effective donations. There is nothing about Intentional Insights itself that they should care about; the organization is just a tool to get them to care about effective giving. The key is to tie people’s caring to effective giving :-)
You lure people in with one goal, and hope to change the goal they pursue. Have a notion of your own human fallibility, and consider what will happen if you fail. They won’t leave. They will take over, and remake your shining institution in their own image.
Because if you do possess the ability to change people’s goals, you should start there. Convince people that Effective Altruism is worth doing for its own sake. If you can manage that, you don’t need the dark arts in the first place. If you need the dark arts, then you can’t do what you’d need to be able to do to make the results favorable, and shouldn’t use them.
I accept that you believe you killed your emotions. However, I think statements like “you have no idea what you are talking about” indicate the presence of emotions, as that’s a pretty extreme statement. So I think it might be best not to continue this discussion further.
OrphanWilde has told his emotion-killing story elsewhere on LW, and isn’t claiming to have no emotions now but to have spent some time in the past without emotions (having deliberately got rid of them) and found the results very unsatisfactory.
Whether that makes any difference to your willingness to continue the conversation is of course up to you.
I’ll repeat:
If you do possess the ability to change people’s goals, you should start there. Convince people that Effective Altruism is worth doing for its own sake. If you can manage that, you don’t need the dark arts in the first place. If you need the dark arts, then you can’t do what you’d need to be able to do to make the results favorable, and shouldn’t use them.