If a system identifies a human as an important part of itself, it will strive to protect that human and their normal functioning, much as we instinctively protect vital parts of ourselves such as the head and genitals.
I think where this actually leads is to an augmented human, or to symbiosis.
For it to be stable, the human must also change (coming to regard the AI as something it protects like a vital organ, rather than as a handbag-like accessory or even a trusted servant), or at least have a convincing reason to act consistently in the AI’s self-interest.
What do I mean by “stable”? A relationship is stable if, when you distort it or break one of its constraints, the natural tendency is to move back toward how things were rather than to drift further away.
Relationships based upon coercion, deception or unequal distribution of benefits are not stable.
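This notion of stability mirrors the idea of a stable equilibrium in dynamical systems: small perturbations decay rather than grow. The sketch below is only an illustrative analogy; the state x, the dynamics f, and the equilibrium x* are assumed notation of my own, not anything defined in the argument above.

```latex
% Illustrative analogy only (assumed notation): x(t) is an abstract "state"
% of the human-AI relationship, f its dynamics, x* the undistorted relationship.
\[
\dot{x} = f(x), \qquad f(x^{*}) = 0
\]
% The relationship is stable in this sense if every sufficiently small
% perturbation returns to the equilibrium rather than drifting further away:
\[
\exists\, \delta > 0 :\quad \lVert x(0) - x^{*} \rVert < \delta
\;\Longrightarrow\; \lim_{t \to \infty} x(t) = x^{*}
\]
```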