Right. Robin doesn’t buy the “AI goes foom” model, or the idea that formulating and instilling a foolproof morality/utility function will be necessary to save humanity.
I do miss the interplay between the two at OB.