I would guess GCRs are generally less impactful than pressures that lead our collective preferences to evolve in a way that we wouldn't like on reflection. Such failures are unrecoverable catastrophes in the sense that we have no desire to recover, but in a pluralistic society they would not necessarily or even typically be global. You could view alignment failures as an example of values drifting, given that the main thing at stake is our preferences about the universe's future rather than the destruction of earth-originating intelligent life.
I expect this is the kind of thing I would be working on if I thought that alignment risk was less severe. My best guess about what to do is probably just futurism: understanding what is likely to happen, and giving us more time to think about it, seems great. Maybe eventually that leads to a different priority.