That was my understanding, but I think that any world in which there is an AGI that isn’t Friendly probably won’t be very stable.
I think you’re right. The main risk would be Friendly to Someone Else AI.