AI research will progress with or without one man’s ideas. EY is impeding alignment research by choking out object-level discussion, without any benefit. Simply entertaining the idea of AGI following one particular path, be it deep learning, Bayes nets, or whole brain emulation, would give the discussion a basis to start from. Sharing ideas wouldn’t do anything to push forward capabilities research unless he inadvertently convinced researchers to put in the work to implement them. Ideas don’t implement themselves.