Have you read his paper on CEV? To the best of my knowledge, that’s the clearest place he’s laid out what he wants an AGI to do, and I wouldn’t label it “take over the world and do what [Eliezer Yudkowsky] wants” except under a reading of those terms so broad that it drops their typical connotations.