Perhaps one axis of disagreement between the worldviews of Paul and Eliezer is "human societal competence." Yudkowsky thinks the world is inadequate and touts the Law of Earlier Failure, according to which things break down in an earlier and less dignified way than you would have thought possible. (The coronavirus pandemic supplies plenty of examples here.) Paul puts stock in efficient-market-hypothesis-style arguments, updating against <10-year timelines on that basis, expecting slow, distributed, continuous takeoff, and expecting governments and corporations to take AGI risk very seriously and enforce very sophisticated monitoring and alignment schemes.
(From a conversation with Jade Leung)