In order to recruit new EAs, your pitch will almost certainly have to downplay certain areas that many core EAs spend a lot of time thinking about.
I think “core” EAs understand and are comfortable with that, so they won’t feel alienated.
-
As long as your goal is to increase that number [of core EAs], you’re going to see very low recruitment rates.
Some of this comes down to what counts as an “EA”. What kind of conversion do we need to do, and how much? I also think I’ll be pretty unsuccessful at getting new core EAs, but what can I get? How hard is it? These are things I’d like to know, and things I believe would be valuable to know.
-
If you want more “total altruistic effort,” go convince people to show more altruistic effort.
So you expect movement building / outreach to be a lot less successful than community building (“inreach”, if you will)?
Yes, especially if the same strategies are expected to accomplish both. They’re two very different tasks.
I think you can convince people to give more of their money away, to take a charity’s effectiveness into account, to care more about animals or to stop eating meat, and possibly to accept that there are technological risks greater than climate change and nuclear war. I don’t think you’ll convince the same person of all of these things. Rather, they’ll be individuals who are on board with specific parts and who may or may not identify with EA.
I wouldn’t say the same strategies will work equally well for both. But I do think either would have spillover effects on the other. Right now, it seems we’re focused on community building, though.
~
I’d be interested in how much overlap there is between these groups. It was never my intention to try to convince people of the entire meme set at once, but I wouldn’t dismiss it as implausible. I think better understanding these channels (how people come to these beliefs) is most important.
If you’re talking about recruiting new EAs, it sounds like you mean people who agree enough with the entire meme set that they identify as EAs. Have there been any polls on what percentage of self-identifying EAs hold which beliefs? It seems like the type of low-hanging fruit .impact could pick off. That poll would give you an idea of how common it is for EAs to believe only small portions of the meme set. I expect that people agree with the majority of the meme set before identifying as EA. I believe a lot more of it than most people do, and I only borderline identify as EA.
If you formulate a bunch of questions, we could add them to the next LW census. In general, formulating an EA census might also be a worthwhile project.
We’ve just launched one, with Peter’s help: http://lesswrong.com/lw/k60/2014_survey_of_effective_altruists/