Is a FAI friendly towards superintelligent, highly conscious uFAI? No, it’s not. It will kill it.
Are you sure? Random alternative possibilities:
Hack it and make it friendly
Assimilate it
Externally constrain its actions
Toss it into another universe where humanity doesn’t exist
Unless you’re one yourself, it’s rather difficult to predict what other options a superintelligence might come up with that you never even considered.