X-risks tend to be more complicated beasts than lions in bushes, in that successfully avoiding them requires a lot more than reflexive action: we’re not going to navigate them by refusing to understand them carefully.
I actually agree entirely. I just don’t think we need to explore those x-risks by exposing ourselves to them. We’ve already advanced AI far enough to start understanding and thinking about those x-risks, and an indefinite (though perhaps not permanent) pause in development would let us get our bearings.
Say what you need to say now to get away from the potential lion. Then back at the campfire, talk it through.
If there were a game-theoretically reliable way to get everyone to pause all together, I’d support it.