For example, consider a system that takes seriously the idea of souls. One might very well decide that all that matters is whether an entity has a soul, completely separate from its apparent intelligence level. Similarly, a sufficiently racist individual might assign no moral weight to people of some specific racial group, regardless of their intelligence.
Right you are. I did not express myself well above. Let me try and restate, just for the record.
Assuming one does not assign equal rights to all autonomous agents (for instance, if we take the position that a human has more rights than a bacterium), then discriminating based on the cognitive capacity of the species (not the individual), as one of many possible criteria, is not ipso facto wrong. It may be wrong some of the time, and it may be an approach employed by bigots, but it is not always wrong. This is my present opinion, you understand, not established fact.
There’s the additional problem, which I pointed out, that it wouldn’t even necessarily be in humanity’s best interest for the entity to have such an ethical system.
Agreed. But this whole business of “we don’t want the superintelligence to burn us with its magnifying glass, so we in turn won’t burn ants with our magnifying glass” strikes me as rather intractable. Even though, of course, it’s essential work.
I would say a few more words, but I think it’s best to stop here. This subthread has cost me 66% of my Karma. :)