I think also unaligned with yourself?
Like, most humans, when given massive power over the universe, would probably accidentally destroy themselves, and possibly all of humanity with them (Eliezer touches on this in HPMOR, in sections I don’t want to reference because of spoilers, and Wei Dai has discussed it in a number of comments related to “the human alignment problem”). I think maybe I could avoid doing that, but only because I am really mindful of the risk, and I don’t think the me from 5 years ago would have been safe to drastically scale up, even with respect to just my own values.