Something about Status Ladders in Berkeley
This is a quick post about some experiences with the Berkeley rationalist community. I haven’t had great experiences talking about this in person with folks during the two months I lived there. The perceptive reader may be able to discern who I am, though I would respectfully like to remain anonymous. It is not clear that posting on LesserWrong is the right way to communicate such experiences; if it’s not, just let me know and I’ll update my content to better reflect what the site is looking for.
Preface: My experiences have been largely positive, but there were a few worrying signs. This came from spending about two months living in Berkeley and working closely with the Rationalist community, around the MIRI or CFAR office. I do not claim to speak for any organization or person except myself. The time frame was roughly February to March of 2017.
Background: I have been to a CFAR workshop, and have volunteered at two. Also, 1/4 of the details I’ve given about myself are falsified to prevent identity-detection. I really loved the environment of the CFAR workshop. I’ve also participated in 1-3 of CFAR’s workshops, including but not limited to the CFAR workshop for machine learning researchers, some workshops on AI safety, MIRI workshops, and the like. I have also worked on one or two smaller Rationalist projects.
Immediate thoughts: Honestly, spending lots of time in the community was scary.
Immediate thoughts part 2: I was worried about what I saw as the “status ladders” among the people who I socialized with or hung around. It seemed like there was a very clear pecking order, where a couple folks I interacted with who claimed to be above such status games quite evidently practiced them regularly.
I would have liked to believe this was only a small subset of the Rationalists, and that most people weren’t like this. This may be the case. However, I am quite confident that said Rationalists are or were considered high-status in the community, whether people admit this or otherwise.
Sorry about the lack of detail and clarity here; I’m attempting to balance between my own anonymity, the anonymity of people I refer to, and providing enough detail to be informative.
One particular case study: when I reported some practice or some folks who gave me the willies, whoever I reported to would say something like “Why do you think that?” while making it clear with their body language, and their verbal language, that I was incorrect and needed to be diagnosed. That whole dynamic left a rather bad taste in my mouth, to be quite honest. This response replicated three or four times, and ran the gamut from people who hung around the MIRI office to CFAR instructors and people universally considered high status by this community. This assumption of my “clear” wrongness was especially prominent when I noted that someone high status in the community gave me bad vibes. Admittedly, an outsider (me) criticizing a valued community member is not considered tasteful anywhere.
Note: I know that “why do you think that?” or variants, is something popular to say in the Rationalist community. Usually it bespeaks honest curiosity. This time, I am fairly certain it contained far more assumption of wrongness than genuine curiosity.
______________________________
In France, the far-right has a saying: “On est chez nous.” It means: “We are at home.” It is usually uttered by total racists who fear that France’s immigration model (where people are expected to avoid public displays of their culture and assimilate into France) is being broken by the wave of immigration from Northern Africa and Syria.
However, it resonates with me: being at home. Somehow, the aesthetic within the Rationalist community at Berkeley turned this one young human away from it, and I think that’s a little bit sad. I still think people are up to good things, though I hesitate to return to the space. The feeling of being at home was not present, and I am very sure I am not the only one to turn away from the space because of this.
I hope I am not making an implicit accusation against all the self-identifying Rationalists, or high status members of the community. I’m only applying a weakly descriptive label to avoid the accusation or assumption that I have constantly received in the past. This assumption is that I must be referring to people “outside the core Rationalist group”. AKA that I am referring to the “barbarians”, not the “civilized folk”. For those who are active members of the Rationalist community who would never do what I described, my post does not refer to you (though honestly, some of the people it does refer to would not think it refers to them. It’s always someone else :) )
*
Negative bias applies, and it appears that I have been hanging out with a skewed and non-representative set of Rationalists. In the past, this has led people to write off my experiences as being non-representative. I think that would be a mistake, and it feels rude when people do so.
I was wondering what community thoughts around this are.
I think it would be quite valuable if you could talk to Elizabeth Garrett or Julia Wise, who are respectively the community managers and conflict mediators for the rationality and EA communities. Especially if you have had bad experiences with specific individuals, it is very important to have one person who is sworn to confidentiality who can aggregate the information that something bad is happening from multiple people, and take action if necessary. I am happy to put you into contact with either of them and keep your identity completely secret. Easiest is probably to message me on FB or on Discord, since the PM system on here is still a bit broken.
I am obviously also really sorry for the experiences you’ve had and would also be happy to chat about them directly. I care a lot about the health of this community and think status dynamics like this can both be really damaging and require a lot of ingenuity to solve.
Maybe I’m cynical, but this is a strange sentiment to have because the public parts of the Bay Area rationalist community (I personally opine) are made of status ladders.
I don’t think I know what you mean by this. Status ladders definitely play a large role in the Bay Area, as they do in almost any community, but there are arrangements of status that cause people to feel gaslit and marginalized who would otherwise have a potential to contribute valuable resources to a community, and there are arrangements that cause them to feel positively encouraged, safe and understood, even if they don’t turn out to be a good fit for the community.
The Bay Area rationality community does some things in this space very well, especially compared to the broader world, but this post is evidence that it does some things less well. Especially on the honesty level, where the OP seems to have gotten a sense that at least a subset of the people in the Bay Area do not have enough self-awareness of the status dynamics that are going on, or might be semi-consciously deceiving people by denying that they exist, while at the same time taking advantage of them.
I am not in the camp of people who says that you should not have status hierarchies in a community. It’s hard to coordinate people, and status based thinking is so deeply embedded in people’s thinking that I expect it will inevitably play a part in any community. However, you can drastically change what things are assigned status, how transparent the system is, and to what degree the community is aware of the role that it plays. I am interested in improving those.
Also, since this is demon-thread material, I might disengage if I find myself getting too emotionally involved.
Thank you for the comment. I know Elizabeth somewhat well, and I’m really happy that she became CFAR’s community manager :). Her position came after my time in the physical community. Perhaps I’ll feel more at home at Berkeley nowadays.
Unfortunately, since this post is (intentionally) so vague, it’s hard to figure out what to make of it. Can you clarify any more on “bad vibes”?
The vibe was: person felt kind of “rapey” (and I mean this in the most serious sense). It is unclear whether this vibe was presenting more than pure noise.
Fwiw I didn’t think ChristianKL’s comment was in bad faith, but I could see how it could be taken as such. Agree with the decision to shut down that thread though. There was some kindling ready for a flame war there :).
While this comment seems positive and good to me, I think it’s important to have a policy of not commenting on closed-down threads anywhere but meta. So I am closing down comments here as well, and am dutifully giving you a warning to not do that again.
I don’t mean to imply that whatever Berkeley is doing is wrong. I don’t mean to imply that others are to blame for my feelings. Yet, for the sake of data: I didn’t feel welcome either. This may be:
My own imposter syndrome, being intimidated by the intelligence of people around me. In this case, so be it. Maybe some rationalists can have a go at debugging the syndrome for the community.
Berkelean tribal instincts that either go unnoticed or are reflexively endorsed, in which case it may be useful for Berkeley people to evaluate their policy
A (semi-)conscious policy, in which Berkeley is kinda trying to keep its numbers down or its average person quality up. There is something to say for this policy, though I would prefer that Berkeley people be honest about it
In any case, I’d like to see more data. If this is a common thing, we should address it some way.
I hadn’t thought about it being perceived this way. As perhaps with many communities, it sometimes happens that there are more people than can be accommodated at an event or than can fit within the active social network of a person (cf. Dunbar’s number). As a result, if you are running an event or just trying to keep tabs on your friends (your tribe), you only have room for so many and have to make a cut somewhere (not necessarily consciously, but at some point people will fade out of your purview). This unfortunately means it’s not as easy to “get in” with the Berkeley crowd, because many events and people are already at their limits, so a new person coming in requires an existing person going out.
Now there’s plenty of natural churn—people move, interests change, etc.—and this offers opportunities for new people to come in without displacing anyone, but at the margins there is definitely going to be some competition to stay in the tribe. Like, if you’re the 100th person I think of you’re at more risk of not getting an invite than if you are the 10th person I think of, and being 100th you are at more risk of being forgotten because I recently met someone new or just haven’t talked to you in a while. And if you find yourself feeling you are on the edges of the tribe it can be distressing to have to work to stay close enough to the fire to remain warm.
Anyway this is all to say that the Berkeley rationalists are of such a size that they naturally exhibit behavior patterns matching those of a human tribe. I don’t know if I would call this a “policy” though, and certainly many rationalists, being humans, are unaware they are engaged in these social dynamics such that they might say hypocritical things to signal status, membership, etc..
Perhaps the good news is that there’s a natural counterbalance to this that I already see happening: tribal split. That is, we’re big enough in Berkeley and the Bay Area that I feel like we’re developing at least 2 if not 3 tribes. The details are still fuzzy because we’re not quite big enough to force a solid split and I expect there to always be plenty of cross-over because we are all part of the same clan (I realize now I have my use of tribe and clan reversed...), but this is naturally what will happen if we grow as a community.
Like with other small, tight-knit groups, rationalists will be welcome anywhere rationalists congregate, but only so many folks can be members of a particular congregation at the same time.
“I feel like we’re developing at least 2 if not 3 tribes”—how do you see this happening? And are you talking about a LW/EA divide or a location divide, etc.?
Hmm, I’m not quite sure what the attractors are. It’s not purely location, although there is probably a correlation between distance from the Bailey on Ward street and feeling like you’re with the in crowd. EA is correlated with which people you are likely to spend time with, but that’s not quite it either. There’s also maybe some amount of self-selection around social norms between what we might call nerds (people observing nonstandard social norms or just not observing social norms) and normies (people observing sufficiently standard social norms that they can and do have non-nerd, non-rationalist friends). But there’s nothing so strong on its own that we’ve got clear divisions yet, just some early clustering around things like EA/non-EA and nerd/normie.
What’s your guess for the ratios involved? (of EA/non-EA and nerd/normie)
Hi there—posts about the rationality community, or posts only of interest to those in the connected social scene, are not for the frontpage, so I’ve moved this back to your personal LW blog. (You can also find this post in the community filter.)
Thank you! I apologize about the confusion. Duly noted for the future!
No worries :-)
Also, I’d only skimmed it when I moved it back to your personal blog. Rereading I notice you said
Yeah, so the personal blogs are for anyone to write what they please. In general I’m happy to hear about the sorts of info that I’m biased not to hear about (like this). For example, I just gave someone low-confidence, strongly negative feedback on a significant project in large part because I expected them to not get that sort of feedback where it existed (and also because I believed them to care about the truth of the matter).
It’s hard to know how much to trust your impressions given the lack of details and your anonymity, but sometimes the data you get is just going to be like that *shrugs*. My initial guess is that, while it’s hard to update a great deal on a single data point, it is valuable so that patterns can emerge over time. The post does feel genuinely written in good faith, so I’ll keep it as a data point—thanks. Regardless of what community I might be in, it’s good to have little nudges about the sorts of biases I’m naturally subject to regarding overly trusting those who are high status.
This post is not only of interest to those in Berkeley. If I was in Berkeley I wouldn’t have much use for reading about what people are doing IRL since I’d be living it instead of feeling left out. Given that this information informs the sorts of things people write about online but is hard to get for those of us reading from afar, I’d consider it comparatively high value compared to the usual frontpage material. People’s lives shape what they write, and so I’d consider something like this less meta than a post that tried and struggled to distill some general principle from events.
First off, I have nothing to do with Berkeley, and I’m just a newcomer to the community.
But your post leaves me with multiple questions:
First, what are you trying to achieve? What I could guess: try to attract the attention of the Berkeley people in another fashion, after talking to some of them in the flesh already. Another option: warn the rational community at large that this kind of thing can happen and attention must be paid.
But what kind of things? That’s really what I am missing from this story. Someone was “giving you bad vibes” and “their practices were giving you the willies”. Could you not expand on that without deanonymizing anyone? I mean, the people you’ve complained to already know who you are, so this is not a concern. If the “bad practice” would deanonymize the person, then maybe it’s not a concern to them that this thing stay hidden (just don’t name them).
As for body language, I can imagine the following scenario: if someone told me that a friend was bad along some axis and that I had never noticed that personally, it’s likely my body would instinctively react in the same fashion. Nevertheless it doesn’t mean I would dismiss you out of hand after hearing you out.
How far have you taken these discussions? After making the nature of the problem clear, what was these people’s response? Your post makes it seem as though, feeling the problem wasn’t taken seriously, you didn’t press the issue.
The first point is a good question. The goal is personal: feeling out whether such thoughts find a place in the LW space, and if so, engaging with this community more :). Alternately, it’s a reflection/”blog post”, whose partial intention is to throw some thoughts out there and hear alternative perspectives that may cause me to update.
The second point: yeah, that’s a great point and sadly I don’t know how to elaborate effectively. I’ll think on it and update this comment if that changes. Thanks for the point on body language—that’s a good one.
As for discussions, I once had a fairly long 3-hour one with some pillars of the community on this, which didn’t feel like it resolved anything. It was nice that the discussion happened at all, though.
Seems potentially very important, but difficult to judge for someone like me who doesn’t live in Berkeley. I could try imagining what exactly you meant, but that would be more about my imagination than about the things you actually complain about.
I was thinking about a possible solution, but Habryka already said it: talk with people who are there, and who happen to be good at communicating.
It’s clear that you were uncomfortable sharing this feeling with LessWrong 2.0, so I just want to applaud you for your bravery and to encourage you to write more in the future, so you know that at least one person will react positively :).
Never been to the Bay Area, so the only rationalists I met were at EA events or LW meetups. I get the bad vibes you described from one in every two self-described rationalists on average. I don’t get these vibes from Marxists, Social Justice activists, or fundamentalist Christians, so something seems very wrong. I haven’t conceptualized my fuzzy perception yet, but the absolutely alien social dynamics that many rationalists exhibit, both on the internet and in real life, make me uncomfortable to the point that I don’t feel like there’s any point in participating in rationality activities, networking with rationalists, or even identifying as a rationalist.
You didn’t feel ‘at home’ in Berkeley? This is how I felt about LessWrong 2.0 when it started.
Note, though, that I have none of those issues with people, who identify as primarily Effective Altruists. EAs seem, on the contrary, nicer and warmer than the average person.
Different people can feel “uncomfortable” as a reaction to different things. The fact that both you and the author get uncomfortable feelings doesn’t necessarily mean you refer to the same thing. (Also, the typical uncomfortable things at LW meetups may differ from typical uncomfortable things in Berkeley.)
Please, don’t get me wrong: I think that it is important for people who feel uncomfortable to notice their feelings, and announcing it may be the right thing to do. I am just saying that if we don’t talk specifically enough, we might get an illusion of talking about the same thing, while actually thinking about different things.
I AM SUMMONED.
Can we please start taking this seriously?
I suspect you might get more traction on comments like these if you provided examples, and/or reasons for taking this more seriously. As it stands, it’s not clear to me why I should be taking it more seriously other than taking your word for it [and I say this as someone who wrote the section you just commented on :)].
I have never personally been in the Bay Area, but to me your post feels like you are reacting very much to the vibes you get from people, while a lot of people in the community focus on facts and not on communicating in a way that produces good vibes.
You either don’t understand the post, or don’t understand people. Vibes are not terminal goals to optimize for; they are side effects of something deeper. If you get bad vibes from a person, it means they are unwittingly communicating bad intentions or personality flaws. This has nothing to do with social skills: a person can focus solely on facts, being abrasive and unpleasant to talk to, but still not give bad vibes.
Also, please note that you are at −5 votes now. There is this failure mode, most commonly shared by online atheist posters, where they post something technically correct, but completely irrelevant and in bad faith towards the interlocutor. You’re being deliberately obtuse if you imply that the OP is some kind of anti-fact person, in contrast to rationalists, who are very pro-fact.
Moderator Note: I am shutting down replies to this thread, since I don’t expect this to go anywhere valuable, and most likely just explode badly (Demon Threads and all that)