That part seems less controversial than the others—no Reformation and no Enlightenment means no science, which means no GAI, which means no uFAI. Although I’m sure there are multiple disjunctive ways science might have come about, it is rather strange that it took humanity so long.
As it is, LW—or at least Yudkowsky—believes uFAI is more likely than FAI, and that it is the most concerning current existential risk.
I don’t really see how retaining the divine right of kings would have forestalled all the other existential risks, though.