How is this relevant? We haven’t hit AGI yet, so of course slowing progress wouldn’t have prevented anything bad from happening YET. What we’re really worried about is human extinction, not bias and job loss.
The analogy to nukes is a red herring. Nukes are nukes and AGI is AGI; they have different sets of risk factors. In particular, AGI doesn't seem to allow for mutually assured destruction, the unexpected dynamic that has kept nukes from killing us, so far.
As for everything we would've missed out on: how much better is DALL-E 3 really making the world?
I like technology a lot, as you seem to. But my rational mind agrees with OP that we are driving straight at a cliff and we’re not even talking about how to hit the brakes.
There are other reasons why nukes haven't killed us yet: all the known mechanisms for destruction, including nuclear winter, are too small in scale to cause human extinction.
So we’d only kill 99% of us and set civilization back 200 years? Great.
This isn't super relevant to alignment, but it's interesting that this is actually the opposite of why nukes haven't killed us yet. The more we believe a nuclear exchange is survivable, the less mutually assured destruction deters anyone from firing first.