You are mentioning some aspects keeping from or motivating for suicide. This is the whole point. Suicide is a thinkable option.
Yes, but people generally know what it entails. We don’t want an AI agent to be completely incapable of destroying itself. We don’t want it to destroy itself without a good cause.
Crashing with its spaceship on an incoming asteroid to deflect it away from Earth would be a good cause, for instance.
A cultist fraudster convinced unhappy people to gift their wealth to some other person and commit suicide, with the cult-embellished promise that they’d awake in the body of that other person at another place. Now, that wouldn’t convince me, but could it convince AIXI?
If AIXI had a sufficient amount of experience of the world, I think it couldn’t.