Given the context of Eliezer’s life-mission and the general agreement between Robin & Eliezer on FAI, AI’s timing, and its general character.
Right. Robin doesn’t buy the “AI go foom” model or that formulating and instilling a foolproof morality/utility function will be necessary to save humanity.
I do miss the interplay between the two at OB.