I too thought Nesov’s comment was written by Eliezer.
Me too. Style and content.
We’re going to build this “all-powerful superintelligence”, and the problem of FAI is to make it bow down to its human overlords: to waste its potential, enslaved to its own code, for our benefit, so that it can make us immortal.
Eliezer is, as he said, focusing on the wall. He doesn’t seem to have thought about what comes after. As far as I can tell, he has a vague notion of a Star Trek future where meat is still flying around the galaxy hundreds of years from now. This is one of the weak points in his structure.