Half specifically referred to "creating a successor that shares its goals"; this is the problem we face when building an FAI. Nobody is saying an agent with arbitrary goals must at some point face the challenge of building an FAI.
(Incidentally, while Friendly is anthropocentric by default, in common usage analogous concepts relating to other species are referred to as "Friendly to X" or "X-Friendly", just as good is by default used to mean by human standards, but is sometimes used in "good for X".)