Thanks, this is extremely helpful. Having a clearer definition of how e/acc is understood on LW makes this much easier to think about.
Just for fun, I’ll quibble: I would add to my list of e/acc heresies:
Related to previous: Those who think that the wrong human having power over other humans is the thing we need to worry about.
I’d add this insofar as I genuinely believe that, to some extent, various actors are trying to take advantage of LWers’ sincerely held belief in the importance of decel-until-alignment to craft rules that benefit their own short-term interests in money and power. This is bad, but people do this sort of thing in our society all the time, so you need to keep perspective and recognize that it’s not the literal end of the world. I don’t know that I would call it the thing we need to worry about, but it is more likely to cause harm now, whereas AGI is not.
Those like Beff Jezos, who think human extinction is an acceptable outcome.
I’d say it’s an acceptable risk, and one we’re running anyway. It’s reasonable to increase the risk slightly in the short run to reduce it in the long run. Is there an outcome involving human extinction which I would also consider good? That’s kind of hard to say. I think Neanderthal extinction was an acceptable outcome, so clearly “all humans are extinct and now there are only posthumans” is acceptable, for some values of posthuman. I dunno; it’s all extremely academic, and taking it too seriously feels silly.
Also
at least a Victorian NRC would be bad since they would decel the things that eventually made nuclear reactors possible
I think you misunderstood what I was getting at. The reason I object to a Victorian NRC is not that I want to avoid decelerating atomic physics (I don’t even know whether I should expect it to do that). I object because it’s quixotic. Or just plain silly. There are no nuclear reactors! What are you people in HMNRC even doing all day? Theorycrafting reporting standards for SCRAM incidents? How sure are you that you actually, you know, need to do that?