If Omega materialized and told you Robin was correct and you were wrong, what would you do for the next week? The next decade?
About what? Everything?
Given the context of Eliezer's life-mission and the general disagreement between Robin & Eliezer: FAI, AI's timing, and its general character.
Right. Robin doesn’t buy the “AI go foom” model or that formulating and instilling a foolproof morality/utility function will be necessary to save humanity.
I do miss the interplay between the two at OB.