I agree with everything non-linguistic. If we get rid of words like 'right', 'wrong', and 'should', then we are forced either to come up with new words or to use 'want' and 'desire'. The first option is confusing, and the second can make us seem like egoists, or like people who think that wireheading is right because wireheaded people desire it. To someone unfamiliar with this ethical theory, it would be very misleading. Even many of the readers of this website would be confused if we only used words like 'want'. What we have now is still far from optimal.
If we get rid of words like right, wrong, and should, then we are forced to either come up with new words or use ‘want’ and ‘desire’.
...and ‘preference’ and ‘value’ and so forth. Yes.
If I am talking about current human values, I endorse calling them that, and avoiding introducing new words (like “right”) until there’s something else for those words to designate.
That implies neither that I'm an egoist nor that I endorse wireheading.
I agree with you that somebody might nevertheless conclude one or both of those things. They’d be mistaken.
I don’t think familiarity with any particular ethical theory is necessary to interpret the lack of a word, though I agree with you that using a word in the absence of a shared theory about its meaning leads to confusion. (I think most usages of “right” fall into this category.)
If you are using ‘right’ to designate something over and above current human values, I endorse you using the word… but I have no idea at the moment what that something is.
I tentatively agree with your wording, though I will have to see if there are any contexts where it fails.
If you are using ‘right’ to designate something over and above current human values, I endorse you using the word… but I have no idea at the moment what that something is.
By definition, wouldn’t humans be unable to want to pursue such a thing?
Not necessarily. For example, if humans value X, and "right" designates Y, and aliens edit our brains so that we value Y, then we would want to pursue such a thing. Or if Y is a subset of X, we might find it possible to pursue Y instead of X. (I'm less sure about that, though.) Or various other contrived possibilities.
Yes, my statement was way too strong. In fact, it would need to be even weaker than what you say; one could just start a religion that tells people to value Y. I was trying to express an actual idea with that sentence originally, but the idea was wrong, so never mind.
But supposing it were true, why would it matter?
What does this mean? Supposing that something were right, what would it matter to humans? You could get it to matter to humans by exploiting their irrationality, but if CEV works, it would not matter to the extrapolated result.
What would it even mean for this to be true? You'd need a definition of 'right'.