EDIT: THIS IS NOT APRIL FOOLS RELATED
ALSO: This is specific to the LW scene in and around Berkeley, as this is the only place where e/acc exclusion is asserted to take place.
I haven’t been around the LW scene for some time, but I understand it’s common to exclude e/acc people from events. I further understand this to be exclusion on philosophical grounds, not just because LW-ites tend to view e/acc people individually as unlikeable.
I personally don’t want to try to sneak into LW parties if I’m someone that the hosts are trying to exclude on philosophical grounds. So I’d rather clarify whether, in the opinion of various people, I count.
It’s common among e/acc people to say things like “We’re so close, just don’t die,” by which they mean that AGI is close. They also want to create AGI as soon as possible. By contrast, LW-ites typically believe that AGI is close, and that it is therefore necessary to slow down or stop AGI development as soon as possible, in order to ensure that future development is done safely.
I part ways from both camps in believing that we’re nowhere close to AGI, that the apparently impressive results from LLMs are highly overrated, and that the X-risk from AI is zero for the foreseeable future. If I didn’t think this, I would be sympathetic[1] to the desire to stop AI until we thought we could do it safely. But I do think this, so AI safety seems like a Victorian Nuclear Regulatory Commission. The NRC is a good thing, but it’s going to be a while before splitting the atom is even on the table.
As a result, in practice I think I’m functionally e/acc because I don’t want to stop the e/acc people from trying to push AGI as fast as possible. I don’t think they’re actually an X-risk since they’re not going to succeed any time soon. But I’m theoretically decel because if I thought anyone was anywhere close to AGI I’d be sympathetic to efforts to restrain it. As it is, I think the AI safety people can continue to study AI safety for years confident that they can finish all the theories off long before they actually become necessary for survival.
In light of that, if you’re the sort of person who wants to exclude e/acc people from your party, should I just not show up? That’s fine with me, I’d just as soon know ahead of time.
Actually, the fact that I have to even ask this question makes me disinclined to show up anyway, but I’m sort of curious what people would say.
[1] “Sympathetic” does not necessarily mean “in favor of.” It’s a practical question whether various strategies for controlling AI development are feasible or worth their risks. If you have to risk nuclear war to ensure the other players don’t cheat, it might not be worth it. Thus I’m not comfortable saying in the abstract “I’m in favor of measures to control AI development” given that I’m not sure what those measures are.
I think the desire to exclude e/accs is mainly because of their attitude that human extinction is acceptable or even desirable, not because of the specifics of what regulatory actions they support. So how do you feel about human extinction?
I described my feelings about human extinction elsewhere.
However, unlike the median commenter on this topic, you seem to grant that e/acc exclusion is actually a real thing that actually happens. “The desire to exclude e/accs is mainly because of their attitude that human extinction is acceptable or even desirable” is a strange thing to say if there was not, in fact, an actual desire among LW party hosts in Berkeley to exclude e/accs. So inasmuch as other respondents have raised doubts in my mind about the truth of this, would you mind clarifying:
1. Whether you do in fact believe that e/acc exclusion from LW parties is a real phenomenon.
2. What kind of experience this is based on.
I’m not in Berkeley and I have no direct knowledge of Berkeley parties, but a certain level of contempt or revulsion toward e/acc seems pretty universal among the LW-aligned people I know. As others have said, there’s no explicit rule against e/accs showing up at Berkeley parties, and I have no reason to doubt that. I personally wouldn’t feel entirely comfortable at a party with a lot of e/accs.
Your view is compatible with the ideology of e/acc. Dunno about house parties; I probably wouldn’t be invited, but:
https://www.lesswrong.com/posts/mmYFF4dyi8Kg6pWGC/contra-ngo-et-al-every-every-bay-area-house-party-bay-area
So the partygoers invited a rabbi and seem to be self-aware enough to admit that their own organization is reasonably defined as a cult. Sounds like you could score an invite if you’re the kind of person who gets invited to other parties a lot.
Evidence on ideology: https://thezvi.substack.com/p/based-beff-jezos-and-the-accelerationists
@Zvi gives a list there, with the matching reasons bolded:
Thanks, this is extremely helpful. Having a clearer definition of how e/acc is understood by LW makes this much easier to think about.
Just for fun, I’ll quibble: I would add to my list of e/acc heresies:
Insofar as I genuinely believe that, to some extent, various actors are trying to take advantage of LWers’ sincerely held belief in the importance of decel-until-alignment to craft rules that benefit their own short-term interests in money and power. This is bad, but people do this sort of thing in our society all the time, so you need to have perspective and recognize that it’s not the literal end of the world. I don’t know if I would say it’s the thing we need to worry about, but it’s more likely to cause harm now, whereas AGI is not.
I’d say it was an acceptable risk, and one that we’re running anyway. It’s reasonable to increase the risk slightly in the short run to reduce it in the long run. Is there an outcome with human extinction which I would also consider good? That’s kind of hard to say. I think Neanderthal extinction was an acceptable outcome, so clearly “all humans are extinct and now there are only posthumans” is acceptable, for some values of posthuman. I dunno, it’s all extremely academic, and taking it too seriously feels silly.
Also, I think you misunderstood what I was getting at. The reason I object to a Victorian NRC is not that I want to avoid decelerating atomic physics (I don’t even know whether I should expect that). I object because it’s quixotic. Or just plain silly. There are no nuclear reactors! What are you people at the HMNRC even doing all day? Theorycrafting reporting standards for SCRAM incidents? How sure are you that you actually, you know, need to do that?