The important thing about wireheading in this context is that desires after being wireheaded do not matter. The pleasure is irrelevant for this purpose; we could just as easily imagine humans being wireheaded to feel pain, but to desire continuing to feel pain. The point is that what is right should be pursued because it is right, not because people desire it. People’s desires are useful as a way of determining what is right, but if it is known that people’s desires were altered in some way, they stop providing evidence as to what is right. This understanding is essential to a superintelligence considering the best way to alter people’s brains.
The pleasure is irrelevant for this purpose; we could just as easily imagine humans being wireheaded to feel pain, but to desire continuing to feel pain. The point is that what is right should be pursued because it is right, not because people desire it.
That’s expressed very clearly, thanks. I don’t want to sound rude, I honestly want to understand this. I’m reading your comment and can’t help but think that you are arguing about some kind of universal right. I still can’t pinpoint the argument. Why isn’t it completely arbitrary if we desire to feel pain or pleasure? Is the right answer implied by our evolutionary history? That’s a guess, I’m confused.
People’s desires are useful as a way of determining what is right, but if it is known that people’s desires were altered in some way, they stop providing evidence as to what is right.
Aren’t our desires altered constantly by mutation, nurture, culture and what we experience and learn? Where can you find the purity of human desire?
I get that you are having trouble understanding this; it is hard, and I am much worse at explaining things in text than in person.
What is right is universal in the sense that what is right would not change if our brains were different. The fact that we care about what is right is caused by our evolutionary history. If we evolved differently, we would have different values, wanting what is gleerp rather than what is right. The differences would be arbitrary to most minds, but not to us. One of the problems of friendliness is ensuring that it is not arbitrary to the AI either.
Aren’t our desires altered constantly by mutation, nurture, culture and what we experience and learn?
There are two types of changes to our desires: we may learn more about our own values, which is good and which Eliezer believes to be the cause of “moral progress”, or our values may really change. The second type of change really is bad. People actually do resist such changes, like those who refuse to expose themselves to violence because they think that it will desensitize them to violence. They are really just refusing to take Gandhi’s murder pill, but on a smaller scale. If you have a transtemporal disagreement with your future self about what action your future self should take, your future self will win, because you will no longer exist. The only way to prevent this is to simply refuse to allow your values to change, preventing your future self from disagreeing with you in the first place.
I don’t know what you mean by “purity of human desire”.