I read the latter as specifying something like “all existential risks are averted and the world gets much more awesome” as an optimization target, not as something that he wants to (let alone expects to be able to) do completely and singlehandedly.
There is no “singlehandedly”; individual decisions control the actions of many people.