I was just claiming that your description of pivotal acts, and of the people who support them, was incorrect in a way that people who think pivotal acts are worth considering would regard as very significant, and in a way that substantially weakens your argument as applied to what they actually mean by pivotal acts. I don’t see anything in your comment responding to that claim. Whether pivotal acts are a good idea, with this correction in mind, is something I’d like to treat as a separate discussion.
Now, in this separate discussion: I agree that executing a pivotal act with just a narrow, safe superintelligence is a difficult problem. That said, every path to a state of safety from AGI that I can think of contains difficult steps, so a more fine-grained analysis of the difficulty of the various steps would be needed. I broadly agree with your description of the political character of pivotal acts, but I disagree with your claims about the associated race dynamics: it seems plausible to me that if pivotal acts became the main paradigm, we’d have a world in which a majority of the relevant people are willing to cooperate, or at least don’t want to race that hard against others in that majority, and it’d mostly be a race between this group and e/acc types. I would also add, though, that the kinds of governance solutions/mechanisms I can think of that would suffice to (for instance) make distributed training runs on consumer devices impossible also seem quite authoritarian.
it seems plausible to me that if pivotal acts became the main paradigm, we’d have a world in which a majority of the relevant people are willing to cooperate, or at least don’t want to race that hard against others in that majority, and it’d mostly be a race between this group and e/acc types
I disagree; I think the current race is in many ways already motivated by something of the sort: “if I don’t get to it first, they will, and they’re sure to fuck it up”. Though with no apparent planning for pivotal acts in sight (but who knows).
I would also add, though, that the kinds of governance solutions/mechanisms I can think of that would suffice to (for instance) make distributed training runs on consumer devices impossible also seem quite authoritarian.
Oh, agreed. It’s a choice between shitty options all around.