I’m curious about how your idea handles an edge case. (I am merely curious—not to downplay curiosity, but you shouldn’t consider it, on its own, a reason to devote considerable brain-cycles if it’d take considerable brain-cycles to answer, because I think your appropriation of moral terminology is silly and I won’t find the answer useful for any specific purpose.)
The edge case: I have invented an alien species called the Zaee (for freeform roleplaying game purposes; it only recently occurred to me that they have bearing on this topic). The Zaee have wings, and can fly starting in early childhood. They consider it “loiyen” (the Zaee word that most nearly translates as “morally wrong”) for a child’s birth mother to continue raising her offspring (call him a son) once he is ready to take off for the first time; they deal with this by having her entrust her son to a friend, or a friend of the father, or, in an emergency, somebody who’s in a similar bind and can simply swap children with her. Someone who has a child without a plan for how to foster him out at the proper time (even if it’s “find a stranger to swap with”) is seen as being just as irresponsible as a human mother who had a child without a clue how she planned to feed him would be (even if it’s “rely on government assistance”).
There is no particular reason why a Zaee child raised to adulthood by his biological mother could not wind up within the Zaee-normal range of psychology (not that they’d ever let this be tested experimentally); however, they’d find this statement about as compelling as the fact that there’s no reason a human child, kidnapped as a two-year-old from his natural parents and adopted by a duped but competent couple overseas, couldn’t grow up to be a normal human: it still seems a dreadful thing to do, and to the child, not just to the parents.
When Zaee interact with humans they readily concede that this precept of theirs has no bearing on any human action whatever: human children cannot fly. And in the majority of other respects, Zaee are like humans in their psychology: if you plopped a baby Zaee brain in a baby human body (and resolved the body dysphoria and aging-rate issues) and he grew up on Earth, he’d be darned quirky, but wouldn’t be diagnosed with a mental illness or anything.
Other possibly relevant information: when Zaee programmers program AIs (not the recursively self-improving kind; much more standard-issue sci-fi types), they apply the same principle, and don’t “keep” the AIs in their own employ past a certain point. (A particular tradition of programming frequently has its graduates arrange beforehand to swap their AIs.) The AIs normally don’t run on mobile hardware, which is irrelevant anyway, because the point in question for them isn’t flight. However, Zaee are not particularly offended by the practice of human programmers keeping their own AIs indefinitely. The Zaee would be very upset if humans genetically engineered themselves to have wings from birth which became usable before adulthood and this didn’t yield a change in human fostering habits. (I have yet to have cause to get a Zaee interacting with another alien species that can also fly in the game for which they were designed, but anticipate that if I did so, “grimly distasteful bare-tolerance” would be the most appropriate attitude for the Zaee in the interaction. They’re not very violent.)
And the question: Are the Zaee “interested in morality”? Are we “interested in morality”? Do the two referents mean distinct concepts that just happen to overlap some or be compatible in a special way? How do you talk about this situation, using the words you have appropriated?