The only explanation I can think of is that he never bothered to think about AGI and its consequences in detail.
This is really the norm. It’s not part of their field, and it’s unfair to judge their ability too heavily on that basis. Climate scientists won’t necessarily have a great grasp of the issues studied by climate economists and vice versa, and may still be very strong in their own domains of expertise.
Also, a brief email from an unknown stranger is not going to elicit a lot of additional thought, and responses are going to be heavily influenced by pattern-matching to other things they have encountered. For instance, they may accurately point out that the current state of the art in AI is not capable of posing a substantial threat, because they have experience with scared, ill-informed people who think that such AI is almost ready.