I don’t think what I’m about to post is strictly in keeping with the intended comment material, but I’m posting it here because I think this is where I’ll get the best feedback.
The majority of humans don’t have a concrete reason for why they value moral behavior. If you ask a human why they value the lives or happiness of others, they’ll throw out some token response laden with fallacies, and when pressed they’ll fall back on something like “I just feel like it’s the right thing.” In my case, it’s the opposite. I have a rather long list of reasons not to kill people, starting with the problems that would result if I programmed an AI with those inclinations, and including the desire for people not to kill or torture me. But where other people have a negative intuitive reaction to killing people, flaying them alive, and so on, I don’t. Where other people have a neural framework that encourages empathy, plus inconsequential intellectual arguments to support it, I have a neural framework that encourages inflicting massive levels of suffering on others, plus intellectual arguments restraining my actions from following my intuitive desires.
On to my point. Understandably, it is rather difficult for me to express this unconventional aspect of myself in fleshy-space (I love that term). So I don’t have any well-supported sense of how common non-conventional ethical inclinations are, or how they’re expressed. I wanted to open this up for a discussion of our core ethical systems, normative and non-normative. In particular, I’m interested in whether others have inclinations similar to mine, and how they deal (or don’t deal) with them.