It is a fact which I know about myself, and which I add here.
Relevant here is WHY you can’t kill. Is it because you have a deontological rule against killing? Then you want the AI to have deontological ethics. Is it because you believe you should kill but don’t have the emotional fortitude to do so? The AI will have no such qualms.
It is more like an “ultimatum in the territory”, which was recently discussed on LW. It is a fact which I know about myself. I think it has both emotional and rational roots, but it is not limited to them.

So I also want other people to follow it, and of course the AI too. I also think that an AI would be able to find a way out of any trolley-style problem.