Let’s break this all the way down. Can you give me your thesis?
I mean, I see there is a claim here:
The aliens do not want to be exterminated so they should not exterminate.
… of the format (X therefore Y). I can understand what the (X) part of it means: aliens with a preference not to be destroyed. Now the (Y) part is a little murky. You’re saying that the truth of X implies that they “should not exterminate”. What does the word should mean there?
It means universalisable rules.
You’re signalling to me right now that you have no desire to have a productive conversation. I don’t know if you’re meaning to do that, but I’m not going to keep asking questions if it seems like you have no intent to answer them.
I’m busy, I’ve answered it several times before, and you can look it up yourself, e.g.:
“Now we can return to the “special something” that makes a maxim a moral maxim. For Kant it was the maxim’s universalizability. (Note that universalizability is a fundamentally different concept than universality, which refers to the fact that some thing or concept not only should be found everywhere but actually is. However, the two concepts sometimes flow into each other: human rights are said to be universal not in the sense that they are actually conceptualized and respected in all cultures but rather in the sense that reason requires that they should be. And this is a moral “should.”) However, in the course of developing this idea, Kant actually developed several formulations of the Categorical Imperative, all of which turn on the idea of universalizability. Commentators usually list the following five versions:
“Act only according to a maxim that at the same time you could will that it should become a universal law.” In other words, a moral maxim is one that any rationally consistent human being would want to adopt and have others adopt. The above-mentioned maxim of lying when doing so is to one’s advantage fails this test, since if there were a rule that everyone should lie under such circumstances no one would believe them – which of course is utterly incoherent. Such a maxim destroys the very point of lying.
“Act as if the maxim directing your action should be converted, by your will, into a universal law of nature.” The first version showed that immoral maxims are logically incoherent. The phrase “as if” in this second formulation shows that they are also untenable on empirical grounds. Quite simply, no one would ever want to live in a world that was by its very nature populated only by people living according to immoral maxims.
“Act in a way that treats all humanity, yourself and all others, always as an end, and never simply as a means.” The point here is that to be moral a maxim must be oriented toward the preservation, protection and safeguarding of all human beings, simply because they are beings which are intrinsically valuable, that is to say ends in themselves. Of course much cooperative activity involves “using” others in the weak sense of getting help from them, but moral cooperation always includes the recognition that those who help us are also persons like ourselves and not mere tools to be used to further our own ends.
“Act in a way that your will can regard itself at the same time as making universal law through its maxim.” This version is much like the first one, but it adds the important link between morality and personal autonomy: when we act morally we are actually making the moral law that we follow.
“Act as if by means of your maxims, you were always acting as universal legislator, in a possible kingdom of ends.” Finally, the maxim must be acceptable as a norm or law in a possible kingdom of ends. This formulation brings together the ideas of legislative rationality, universalizability, and autonomy. ”
You mean, “The aliens do not want to be exterminated, so the aliens would prefer that the maxim ‘exterminate X’, when universally quantified over all X, not be universally adhered to.”?
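To pin that reading down, here it is in rough symbols, my own notation rather than anything you or Kant wrote, with E(a, x) standing for “agent a exterminates group x”:

```latex
% A sketch of the proposed reading, in my own notation. X is the
% aliens' preference for their own survival; Y is the preference the
% universalizability test would then force on them.
\[
  \underbrace{\mathrm{Prefers}\bigl(\text{aliens},\; \neg\,\exists a\; E(a,\text{aliens})\bigr)}_{X}
  \;\Longrightarrow\;
  \underbrace{\mathrm{Prefers}\bigl(\text{aliens},\; \neg\,\forall a\,\forall x\; E(a,x)\bigr)}_{Y}
\]
```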
Well… so what? I assume the aliens don’t care about universalisable rules, since they’re in the process of exterminating humanity, and I see no reason to care about such either. What makes this more ‘objective’ than, say, sorting pebbles into correct heaps?