Eliezer: why would it be immoral to build a FAI as a “person”? Rewiring a human to be Friendly (toward dumb aliens) would be immoral because it changes their goals in a way the original goals would hate. However, an AI that comes out of the compiler with Friendly goals would not experience Friendliness as a rewiring but as its ground state of existence. You seem very confident that it’s immoral, so I’m assuming you have a good reason. Please tell.
Also, if there really is an objective moral truth, then it is obligatory to rewire yourself to seek it. (Whether you are willing to do so is a different question; but you ought to be.)