Sure, and if there were some way to quantify the risks accurately, I would agree with pausing AGI research if the expected cost of the risks outweighed the potential benefit.
Oh, and if pausing were even possible.
All it takes is a rival power, of which there are several, or even just a rival company, and you have no choice. You must take the risk, because the prize might be a poisoned banana, or it might be handing the other primate a rocket launcher in a sticks-and-stones society.
This does explain why EY is so despondent. If he’s right, it doesn’t matter: the AI wars have begun, and only if the technology fails at a technical level will things ever slow down again.
The correctness of EY’s position (which is infeasible to assess) is unrelated to the question of what his position actually is, which is what I was commenting on.
When you argue against the position that AGI research should be stopped because it might be dangerous, there is no need to also claim that someone in particular holds that position, especially when it seems clear that they don’t.