Your view is compatible with the ideology of e/acc. Dunno about house parties; I probably wouldn't be invited, but:
https://www.lesswrong.com/posts/mmYFF4dyi8Kg6pWGC/contra-ngo-et-al-every-every-bay-area-house-party-bay-area
Eventually, my Rabbi friend said “Okay, so what I’m hearing is: you’re expected to tithe 10% of your earnings to charity, you have pilgrimages a few times a year to something called EA Global, and you believe a superhuman power will usher in a new era in which humanity undergoes substantial transformations resulting in either total destruction or universal peace.” Heads nodded. “This… sounds a little like a cult,” he said. “Yes!!” multiple people excitedly shouted at once.
So the partygoers invited a Rabbi, and they seem to be self-aware enough to admit that their own organization is reasonably defined as a cult. Sounds like you could score an invite if you are the kind of person who gets invited to other parties a lot.
Evidence on ideology: https://thezvi.substack.com/p/based-beff-jezos-and-the-accelerationists
@Zvi gives a list here, matching reasons bolded:
Those like Beff Jezos, who think human extinction is an acceptable outcome.
Those who think that technology always works out for the best, that superintelligence will therefore be good for humans.
Those who do not actually believe in the reality of a future AGI or ASI, so all we are doing is building cool tools that provide mundane utility; let’s do that.
Related to previous: Those who think that the wrong human having power over other humans is the thing we need to worry about.
More specifically: Those who think that any alternative to ultimately building AGI/ASI means a tyranny or dystopia, or is impossible, so they’d rather build as fast as possible and hope for the best.
Or: Those who think that even any attempt to steer or slow such building, or sometimes even any regulatory restrictions on building AI at all, would constitute a tyranny or dystopia so bad that any alternative path is better.
Or: They simply don’t think that smarter-than-human, more-capable-than-human intelligences would be the ones holding the power; the humans would stay in control, so what matters is which humans those are.
Those who think that the alternative is stagnation and decline, so even some chance of success justifies going fast.
Those who think AGI or ASI is not close, so let’s worry about that later.
Those who want to, within their cultural context, side with power.
Those who don’t believe; they like being an edge lord on Twitter.
Those who personally want to live forever, and see this as their shot.
Those deciding based on vibes and priors, that tech is good, regulation bad. (At least, a Victorian NRC would be bad, since they would decel the things that eventually made nuclear reactors possible.)
The degree of reasonableness varies greatly between these positions.
Thanks, this is extremely helpful. Having a clearer definition of how e/acc is understood on LW makes this much easier to think about.
Just for fun, I’ll quibble: I would add to my list of e/acc heresies:
Related to previous: Those who think that the wrong human having power over other humans is the thing we need to worry about.
Insofar as I genuinely believe that, to some extent, various actors are trying to take advantage of sincerely held beliefs by LWers in the importance of decel-until-alignment to craft rules which benefit them and their short-term interests in money and power. This is bad, but people do this sort of thing in our society all the time, so you need to have perspective and recognize that it’s not the literal end of the world. I don’t know if I would say it’s the thing we need to worry about, but it’s more likely to cause harm now, whereas AGI is not.
Those like Beff Jezos, who think human extinction is an acceptable outcome.
I’d say it’s an acceptable risk, and one that we’re running anyway. It’s reasonable to increase the risk slightly in the short run to reduce it in the long run. Is there an outcome with human extinction which I would also consider good? That’s kind of hard to say. Like, I think Neanderthal extinction was an acceptable outcome. So clearly “all humans are extinct and now there are only posthumans” is acceptable, for some values of posthuman. I dunno, it’s all extremely academic, and taking it too seriously feels silly.
Also
at least a Victorian NRC would be bad since they would decel the things that eventually made nuclear reactors possible
I think you misunderstood what I was getting at. The reason I object to a Victorian NRC is not that I want to avoid decelerating atomic physics (I don’t even know if I ought to expect that). I object because it’s quixotic. Or just plain silly. There are no nuclear reactors! What are you people in HMNRC even doing all day? Theorycrafting reporting standards for SCRAM incidents? How sure are you that you actually, you know, need to do that?