I work at CEA, and I recently became the Interim EA Forum Project Lead. I’m writing this in a personal capacity. This does not necessarily represent the views of anyone else at CEA.
I’m responding partly because my new title implies some non-zero amount of “EA leadership”. I don’t think I’m the person anyone would think of when they think “EA leadership”, but I do in fact have a large amount of say wrt what happens on the EA Forum, so if you are seriously interested in making change I’m happy to engage with you. You’re welcome to send me a doc and ask me to comment, and/or if you want to have a video call with me, you can DM me and I’ll send you a link.
Hi Elizabeth. I wanted to start by saying that I’m sorry you feel betrayed by EA. I’m guessing I have not felt any betrayal that painful in my own life, and I completely understand if you never want to interact with EA again. I don’t think EA is right for everyone, and I have no desire to pressure anyone into doing something they would regret.
I have some thoughts and reactions to the things you (and Timothy) said. On a meta level, I want to say that you are very welcome not to engage with me at all. I will not judge you for this, nor should any readers judge you. I am not trying to burden you to prove yourself to me or to the public.
I have three main goals in writing this:
Because I am new to this role, I think I have a lot to learn about how to best run the EA Forum. It sounds like both of you have thought a lot about how the EA community can improve (or at least, how it has failed, in your eyes). Essentially, it seems like it is in both of our best interests to talk about this with each other.
You don’t know me, but I believe I take truth-seeking seriously. I feel confused when reading/listening to your account, because it seems like our experiences differ in some ways, and I’m not sure if one of us is factually incorrect, or if we agree on the facts but have different standards or definitions for terms like “integrity”, or some other thing I haven’t thought of. So another goal I have is to highlight places where I am concerned about inaccuracies or outdated information, or where I simply have a personal disagreement.
For example, there are relatively few times in the interview where Timothy questions or challenges your points. It’s possible that the goal of this interview was not to be truth-seeking, but instead just to communicate your perspective. If so, then I have no problem with that, but as a reader I would find that surprising, and I would suggest that you be more explicit about the goals of the interview to avoid misleading others.

EDIT: I still think my statement above is factually correct, but I certainly don’t think Elizabeth is at fault for any of the host’s actions, and I would like to avoid implying that. I think it’s customary on LW to leave in edits as crossed-out text, so I’ll do that here.
I will emphasize again that, even though I am starting a conversation, you are very welcome to ignore my comments and move on with your life. :)
As referenced in another comment, under Zach, CEA is taking more of a stewardship role towards EA and I think CEA being more open is an important part of that. So in the spirit of feeling more responsible for EA, I think it is valuable for someone at CEA to engage with this publicly.
I listened to the video and read the transcript, so I’ll structure much of this as responding to quotes from the transcript.
RE: “recruiting heavily and dogmatically among college students”:
I’m certainly no expert, but my understanding is that, while this is a relatively accurate description of how things worked when there was FTX money available, things have been significantly different since then. For example, Jessica McCurdy is Head of Groups at CEA (I believe she took this role after FTX) and wrote this piece about potential pitfalls in uni group organizing, which includes points about creating truth-seeking discussions and finding the right balance of openness to ideas. I would say this is some evidence that recruiting is currently more careful than you describe, since, as Head of Groups at CEA, her views likely have a significant influence on uni group culture.
I wasn’t involved with EA in college, but my own relevant experience is in Virtual Programs. I’ve participated in both the Intro and Advanced courses, plus facilitated the Intro course once myself. In my opinion, the other facilitators and I were very thoughtful about not being dogmatic, and about not pressuring participants into thinking or acting in specific ways. I also talk with a fair number of younger people at conferences who are looking for advice, and something I have repeated many times is that young people should be really careful about how involved with EA they get, because it’s easy to accidentally get too involved (e.g. all your friends are EAs). I’ve encouraged multiple young people not to take jobs at EA orgs. As I alluded to above, I really do not want to pressure anyone into doing something that they would regret.
RE: “the way EA is doing it can’t filter and inform the way healthy recruiting needs to”
I’d be really curious to hear more about what you mean by this, especially if it is unrelated to Jessica’s piece above.
RE: “if I believe that EA’s true values, whatever that means, are not like in high integrity or not aligned with the values I want it to have, then I’m not going to want to lend my name to the movement”
I agree with this. I certainly struggled internally with this following FTX. However, in my experience of meeting people in a variety of EA contexts, from different places around the world, I would say that they are far more aligned with my values than people are on average. This is particularly clear when I compare the norms of my previous workplaces with the norms of CEA. I’ll quote myself from this recent comment:
“When I compare my time working in for-profit companies to my time working at CEA, it’s pretty stark how much more the people at CEA care about communicating honestly. For example, in a previous for-profit company, I was asked to obfuscate payment-related changes to prevent customers from unsubscribing, and no one around me had any objection to this.”
Perhaps more importantly, speaking only for myself, I think Zach in particular shares these values. In a different recent comment, I was responding to a critical article about the Forum and tried to clarify how much the staff time put towards the Forum costs. I thought my original comment was open and clear about these costs, but Zach felt that it was misleading, because it did not mention the indirect overheads that CEA pays per employee, and this could lead readers to think that our team is more cost-effective than it actually is. You can read more in my EDIT section, which I added based on his suggestion. I personally think this is an example of Zach having high integrity and being truth-seeking, and after this exchange I updated towards being more optimistic about his leadership of CEA. Of course you can’t judge any person on a single action, so just like in any other context, you should only think of this as one data point.
RE: “I had had vague concerns about EA for years, but had never written them up because I couldn’t get a good enough handle on it. It wouldn’t have been crisp, and I had seen too many people go insane with their why I left EA backstories. I knew there were problems but couldn’t articulate them and was in, I think, a pretty similar state to where you are now. Then I found a crisp encapsulation where I could gather data and prove my point and then explain it clearly so everyone could see it.”
I would be very interested to read a crisp encapsulation. Apologies if I missed it, but I didn’t see any specific concerns about EA overall that rise to the level of preventing EA from reaching its potential for improving the world, either in your transcript or in your two linked articles in the video description. (Perhaps this is a misunderstanding on my part — perhaps you are highlighting problems that you don’t see as very severe for EA overall, but you left EA because of the lack of response to your writing rather than the severity of the problems?)
The two linked articles are:
EA Vegan Advocacy is not truthseeking, and it’s everyone’s problem
This post clearly took a lot of work, and I appreciate that. I skimmed it to remind myself of the contents, and I generally agree with it. I pretty strongly agree with our Forum norms, which I feel have a similar spirit. Some example snippets:
Don’t mislead or manipulate.
Aim to inform, rather than persuade.
Clarity about what you believe, your reasons for believing it (this could be “I have this intuition for some reason I can’t quite track”), and what would cause you to change your mind.
Note that I am not vegan and I don’t know much about the base facts around veganism and nutrition.
I also don’t recall any time in which someone in an EA context pressured me to be vegan. The closest thing was when a student at an EAGx was visibly disappointed after I told them I wasn’t vegan, which did make me feel bad.
I think veganism can get tied up in people’s identities and can come with baggage. This can make it particularly hard to be truth-seeking around. I think most topics discussed in EA contexts have significantly less baggage, so while I agree that this is concerning, I don’t view this specific example as being strong evidence of a severe broader problem.
I think it’s a very helpful flag that we should be keeping an eye on topics that have some baggage. My understanding is that the Forum’s moderation team does this to some extent, but they are relatively hands-off (for example, they are not required to read everything posted during their on-call week), and so perhaps we should revisit the responsibilities of this role and consider if they should be more hands-on. If you have thoughts about this please let me know.
From your transcript: “EA had failed in noticing these obvious lies that everyone knew”
I agree that, if this is happening, it is bad. I’ll say that I’m not a fan of this exaggerated framing, and I find it quite anti-truth-seeking. But to address the actual point: if you ask me “Sarah, how can you be sure this is not currently happening?” I would say that I certainly cannot prove that it is not happening, since I am privy to a vanishingly small percentage of the communications that happen in EA contexts — like, I barely have time to read the Forum. I end up talking with a fair number of people in EA contexts for my job, and I personally haven’t noticed signs of anything in this category happening recently.
Actually, I have a specific suggestion that might address your concerns here. If you notice something in this category happening again, and you want someone to do something about it, you can reach out to me directly. For example, if you think “no one is saying this thing [on the Forum] because everyone is too scared”, well I am probably not too scared, so I can just post something about it with my account.
On the specific points, I think sometimes your bar is too high. For example, the “Ignore the arguments people are actually making” section really resonated with me, because I also find it annoying when people do that. But I think that’s just how humans work. In my experience, people in EA contexts are probably somewhat better at this than average, and people in rationalist contexts are not better at it than people in EA contexts. I certainly appreciate the reminder and encouragement to do better, and I think it’s always good to strive to do better, but IMO this is pretty intractable.
I don’t spend a ton of time on LW so perhaps those users are overall better at these things than the EA community is. However I still find your bar for the “acceptable level of truth-seeking” pretty vague, and I will say that my in-person interactions with rationalists have not impressed me[1] wrt their ability to be truth-seeking (like, they care about this more than most people, which I appreciate, but I’ve seen them make plenty of mistakes).
Truthseeking is the ground in which other principles grow
Our team actually recently curated this post on the Forum without knowing about your interview 🙂
I like and generally agree with the points you make about how to be truth-seeking, and about why that is vitally important (as does Will who curated the post).
I think it’s pretty sus for someone’s reaction to be “I already do all the things you suggest”, but I kinda want to say something in that direction (at least I want to say that I agree these are good and I strive to do them, though perhaps not in the same ways or to the same extent that you think people should). I don’t want to make this novel of a comment any longer by providing evidence of my actions, but I’m happy to dive into it if anyone is interested.
After re-reading your “EA Vegan Advocacy” post, my guess is that you don’t intend this “Truthseeking…” post to provide concrete evidence/data of your concerns about EA. Please let me know if I’m mistaken about this.
RE: “I would be delighted if that happened. But I think it gets harder to do that every year. As EA dilutes itself doubling with college students every two years.”
It depends on how you define who is “EA”, but based on recent data I have seen, the major growth was in 2022. Therefore, I think it’s broadly accurate to say that EA doubled between 2020 and 2022, but I don’t think it’s accurate to say that about any time after 2022. In particular, after FTX, growth has slowed pretty dramatically across basically anything you’d consider to be EA.
I think the claim that “the doubling is due to college students” needs evidence to support it. My understanding is that many EA groups got more resources and grew around 2022, not just university groups. And the marketing that drove a lot of the growth in 2022 was not aimed at college students, so I don’t see why, for example, the increase in EA Forum users would all be from college students.
RE: “but EA makes a lot of noise about arguing within itself and yet there’s a reason so much criticism on EA forum is anonymous And the criticism on LessWrong is all done under people’s stable pseudonym or real name.”
There are probably many reasons for this, and I think one is that there are fewer power structures within the rationalist community (I don’t think there is an equivalent of OP — maybe Eliezer?). The EA community has more major funders. I think people in the EA community also tend to be more risk-averse than LWers, so they are more likely to default to writing anonymously. And I believe that culturally, LW socially rewards blunt criticism more than the EA Forum does, so there is extra incentive to post it openly, since doing so is more likely to be good for your social status. On the other hand, my impression is that the EA community is more welcoming of criticism of people/orgs in power (in the community, not in the world) than LW is, so it’s possible that criticism on the EA Forum has more teeth. For example, there is a fair amount of criticism of OP and CEA that is well-received on the Forum, and I don’t know of a similar situation for Eliezer and Lightcone on LW (I’m less familiar with LW, so definitely feel free to correct me if I’m mistaken here). So in some sense, I think more anonymity is a result of the fact that holding power in EA is more consequential, and that it is more socially acceptable to criticize those with actual power in the community. To be clear, I’m not trying to encourage people to be scared of posting criticism under their real name, and I don’t know of any specific cases where someone was harmed by doing that, but I think it’s reasonable for a person who doesn’t know the consequences of posting criticism to default to doing so anonymously.
In my opinion, it’s not clear that, in an ideal world, all criticism would be written using someone’s real name. I feel pretty uncertain about this, but it seems to me that we should be supportive of people sharing criticism even if they have personal reasons for staying anonymous. Personally, I really appreciate receiving feedback in general, and I would prefer someone give it to me anonymously than not at all.
RE: “E: Which EA leaders do you most resonate with? T: …it’s a list of friends…most of my friends don’t actually want to be talked about in public. I think it speaks against the health of the EA movement right now”
I agree with this, which is why I feel optimistic about CEA putting significant resources towards EA communications work. My guess is that this will be some combination of being more open and publicly communicating about CEA itself, and putting more effort into proactively writing about what EA is and what people in the community are doing, for a broader audience.
RE: “I would suggest that if you don’t care about the movement leaders who have any steering power. You’re, you’re not in that movement.“
I will say that I identified as an (aspiring) effective altruist for many years before I could name or identify any EA leaders (I only started learning names when I was hired at CEA). I simply found the core principles very compelling, and occasionally read some relevant articles, and donated money via GiveWell and some other places. You could argue that I wasn’t “in the movement”, but I do keep my identity small and the principles resonated with me enough to add that identity.
RE: “ea would be better served by the by having leadership that actually was willing to own their power more”

I would say that under Zach, CEA as an institution is taking more of a leadership role than it previously had been (which was basically none), and people within CEA are more empowered to “own our power” (for example, the Forum team will likely do more steering than before, which again was minimal). EDIT: Based on responses, I’m a bit worried that this is misleading, so I will disconnect this point from the actual quote and add some clarifications.
RE: “if you’re going to have a big enough movement, you actually want leaders who are. Not just leading in their shadow but are like hey, I feel called to lead which means like i’m gonna like Be the scapegoat and i’m gonna be the one like making big level observations and if you come to me with a problem I’m gonna try to align the integrity of our entire community around The truth.”
I do expect this means that CEA will be the scapegoat more often. I also expect CEA will put more resources towards high level observations about the EA community (for example, I referenced data about EA growth earlier, which came from a CEA project). I’m less sure about that last point, because EA is ultimately a framework (like the scientific method is), and we can’t be sure how many people are out there inspired by EA principles but do not, like, read the EA Forum. I guess you could just define “the entire community” as “people who read the EA Forum” to circumvent that issue. In which case, I expect CEA to do more of that going forward.
Concluding thoughts
I understand that this is an emotional topic for you; however, I was surprised at the amount of anti-truth-seeking I saw in your responses. (My bar for you is perhaps unreasonably high because you write about truth-seeking a lot, and I assume you hold yourself to the standards that you preach.) For example, I think the state of EA university group recruiting is a key factor in your beliefs, but it also seems like you do not have up-to-date information (nor did you attempt to find that information before stating your internal view as if it were fact). You often use exaggerated language (for example, “EA had failed in noticing these obvious lies that everyone knew”, which is certainly not literally true), which I think is actually quite harmful for truth-seeking. Outside of the “EA Vegan Advocacy” post, I see surprisingly few instances of you publicly thinking through ways in which you might be wrong, or even gesturing towards the possibility that you might be wrong, or at least hinting at what your confidence levels are. I genuinely want to engage with your concerns about EA, but I feel like this post (even together with your two posts linked above) is not epistemically legible[2] enough for me to do that. I can’t find a clear core claim to grasp onto.
“Integrity” is a concept that comes up a lot in the interview. I haven’t really addressed it in my comment so I figured I should do so here. Personally I have some complicated/unresolved feelings about what integrity actually means[3], so I don’t know to what extent I have it. I’m happy to dive into that if anyone is interested. If you want to test me and tell me objectively how much integrity I have, I’m open to that — that sounds like it would be helpful for me to know as well. :)
To close, I’ll just caveat that I spent a long time writing this comment, but because I wrote about so many things, I wouldn’t be surprised if I said something that’s wrong, or if I misunderstood something that was said, or if I change my mind about something upon further reflection. I’m generally happy to receive feedback, clarify anything I said that was unclear, and discuss these issues further. Specifically, I have significant influence over the EA Forum, so I would be particularly interested to discuss issues and improvements focused on that project.
[1] Meaning, based on rationalist writings, I had higher expectations, so I was disappointed to find they did not meet those expectations.
[2] I had forgotten that you were the person who coined this term — thank you for that, I find it very helpful!
[3] For example, I think people sometimes mix up the concept of “having integrity” with the concept of “acting in the same way that I would” or even “acting in a way that I would find reasonable”, but my understanding is that they are distinct. I’m quite unsure about this though, so I could certainly be wrong!
There’s a lot here and if my existing writing didn’t answer your questions, I’m not optimistic another comment will help[1]. Instead, how about we find something to bet on? It’s difficult to identify something both cruxy and measurable, but here are two ideas:
I see a pattern of:
1. CEA takes some action with the best of intentions
2. It takes a few years for the toll to come out, but eventually there’s a negative consensus on it.
3. A representative of CEA agrees the negative consensus is deserved, but since it occurred under old leadership, doesn’t think anyone should draw conclusions about new leadership from it.
4. CEA announces new program with the best of intentions.
So I would bet that within 3 years, a CEA representative will repudiate a major project occurring under Zach’s watch.
I would also bet on more posts similar to Bad Omens in Current Community Building or University Groups Need Fixing coming out in a few years, talking about 2024 recruiting.
[1] Although you might like Change my mind: Veganism entails trade-offs, and health is one of the axes (the predecessor to EA Vegan Advocacy is not Truthseeking), and Truthseeking when your disagreements lie in moral philosophy and Love, Reverence, and Life (dialogues with a vegan commenter on the same post).
Thanks! I’m down to bet, though I don’t feel like it would make sense for me to take either of those specific bets. I feel pretty clueless about whether “a CEA representative will repudiate a major project occurring under Zach’s watch”. I guess I think it’s reasonable for someone who was just hired at CEA not to be held personally responsible for projects that started and ended before they were hired (though I may be misunderstanding your proposed bet). I also have very little information about the current state of EA university group recruiting, so I wouldn’t be that surprised by “more posts similar to Bad Omens in Current Community Building or University Groups Need Fixing coming out in a few years, talking about 2024 recruiting”. TBH I’m still not clear on what we disagree about, or even whether we actually disagree about anything. 😅
Apologies if I wasn’t clear about this, but my main comment was primarily a summary of my personal perspective, which is based on a tiny fraction of all the relevant information. I’m very open to the possibility that, for example, EA university group recruiting is pressuring students more than I would find appropriate. It’s just that, based on the tiny fraction of information I have, I see no evidence of that and only see evidence of the opposite. I would be really interested to hear if you have done a recent investigation and have evidence to support your claims, because you would have a fair chance of convincing me to take some action.
Anyway, I appreciate you responding and no worries if you want to drop this. :) My offer to chat synchronously still stands, if you’re ever interested. Though since I’m in an interim position, I’m not sure how long I will have the “EA Forum Project Lead” title.
RE: “ea would be better served by the by having leadership that actually was willing to own their power more”

“I would say that under Zach, CEA as an institution is taking more of a leadership role, and people within CEA are more empowered to ‘own our power’.”

I think this would be a mistake (or, more likely, I think you and Elizabeth mean different things here).
As you mention in other parts of your comment, most people who consider themselves aligned with EA don’t know or care much about CEA, and coupling their alignment with EA as principles with an alignment with CEA as an organization seems counterproductive.
Ah interesting, yeah it’s certainly possible that I misunderstood Elizabeth here. Apologies if that’s the case!
I’ll try to explain what I mean in more detail, since I’m not sure how my interpretation differs from Elizabeth’s original intent. In the past, CEA’s general stance was more like “providing services” to help people in the EA community improve the world. Under Zach, we are shifting in the direction of “stewardship of EA”. I feel that this implies CEA should be more proactive and take more responsibility for the trajectory of EA than it has in the past (to be clear, I don’t think this means we should try to be the sole leader, or give people orders, or be the only voice speaking for EA). One concrete example is how much steering the Forum team does: in the past, I would have been more hesitant to steer discussions on the Forum, but now it feels more appropriate (and perhaps even necessary) for the Forum team to be more opinionated and steer discussions in that space.
RE: “coupling their alignment with EA as principles with an alignment with CEA as an organization”

Sorry, I don’t feel like I understand this point — could you expand on this, or rephrase?
As a personal example, I feel really aligned with EA principles[1], but I feel much less sure about CEA as an organization.[2]
If the frame becomes “EA is what CEA does”, you would lose a lot of the value of the term “EA”, and I think very few people would find it useful.
See “why effective altruism is always lowercase”, and William MacAskill: “effective altruism is not a package of particular views.”
My understanding is that you agree with me, while Elizabeth would want effective altruism to be uppercase in a sense, with a package of particular views that she can clearly agree or disagree with, and an EA Leader that says “this is EA” and “this is not EA.” (Apologies if I misunderstood your views)
“CEA as an institution is taking more of a leadership role” could be interpreted as saying that CEA is now more empowered to be the “EA Leader” that decides what is EA, but I think that’s not what you mean from the rest of your comment.
Does that make sense?
[1] For me, EA principles are these ones:
I think these are principles that most people disagree with, and most people are importantly wrong.
I think they are directionally importantly right in my particular social context (while of course they could be dangerous in other theoretical contexts).
[2] Despite thinking that all the people I’ve interacted with who work there greatly care about those same principles.
Seeing my statements reflected back is helpful, thank you.
I think Effective Altruism is upper case and has been for a long time, in part because it aggressively recruited people who wanted to follow[1]. In my ideal world it both has better leadership and needs less of it, because members are less dependent.
I think rationality does a decent job here. There are strong leaders of individual fiefdoms, and networks of respect and trust, but it’s much more federated.
[1] Which is noble and should be respected; the world needs more followers than leaders. But if you actively recruit them, you need to take responsibility for providing leadership.
Thanks, that’s very helpful! Yeah I believe you’ve correctly described my views. To me, EA is defined by the principles. I’ll update my original comment, since now it seems that bit is misleading.
(I still think there is something there that gestures in the direction that Elizabeth is going. When I say “CEA is taking more of a leadership role”, I simply mean that literally — like, previously CEA was not viewing itself as being in a leadership role, and now it is doing that a non-zero amount. I think it matters that someone views themselves as even slightly responsible for the trajectory of EA, and you can’t really be responsible without wielding some power. So that’s how I read the “willing to own their power more” quote.)