I don’t know enough to have a valuable opinion on the wider argument, but this sentence:
“EY is a smart guy and I’m sure he could contribute to accelerating AI if he wanted to, but I don’t think him withholding information from us does anything to delay AI.”
seems straightforwardly self-contradictory.
AI research will progress with or without one man’s ideas. EY is impeding alignment research by choking out object-level discussion, without any benefit in return. If he merely entertained the idea of AGI following one particular path, be it deep learning, Bayes nets, or whole brain emulation, that would give the discussion a basis to start from. Just sharing ideas wouldn’t push capabilities research forward unless he inadvertently convinced researchers to put in the work to implement them. Ideas don’t implement themselves.