I think some of the assumptions here have led you to false conclusions. For one, you seem to assume that because humans share some values, all humans have an identical value system. This is just plain wrong: each human has their own unique value “signature,” more or less like a fingerprint. If you place even slightly more weight on one value than a person who is otherwise identical to you, you are different. That said, does your argument still hold once this heterogeneity, minor as it is in the grand scheme of things, is added to human value systems? I don’t think so. I think there is plenty of reason to expect that human values will be much more robust because of this person-to-person variation.
Furthermore, I think the premise of this article really comes back to your claim that boredom is an absolute value. After claiming this, you go on to describe how it evolved over time (which is correct), yet you still hold that it is absolute; can’t you see the contradiction here? How can something be absolute if it evolved in humans over time to enhance survival?

Further, who’s to say that with the advent of ASI this couldn’t be “cured,” so to speak? An ASI should be able to detect the cause of human boredom and genetically reprogram us to fix it. How can something that is structural, a product of the evolutionary and environmental components of human development, be considered a “human value”? Being a value implies that it somehow transcends biological constraints, i.e. traditions like religion, etc. You are painting boredom as a value when it is little more than an instinct. One can argue that something still constitutes a value even though it is rooted in biological structure. I can concede that, but how can you insist that the universe will have no “point” if these “values” get adjusted to accommodate the existence of an ASI? Value is completely subjective to the organism that holds it. The transhuman will have different values, and the universe will not necessarily contain any less value for him/her/it at that time. In fact, it will likely be much richer to them.
Lastly: “A paperclip maximizer just chooses whichever action leads to the greatest number of paperclips.”
I counter with: “a biological system just chooses (through natural selection) whichever action leads to the greatest number of biological systems.” How did this argument help you, exactly? Humans are subject to the same kind of subjective value that a machine ASI would be. The only way to pretend that human value isn’t just another product of this same process is by bestowing some sort of transcendent component on human biology (i.e. a soul or something). I think this is a methodological flaw in your argument.
“The only way to pretend that human value isn’t just another product of this same process is by bestowing some sort of transcendent component on human biology (i.e. a soul or something).”
Human values are special because we are human. Each of us is at the center of the universe, from our own perspective, regardless of what the rest of the universe thinks of that. It’s the only way for anything to have value at all, because there is no other way to choose one set of values over another except that you happen to embody those values. The paperclip maximizer’s goals do not have value with respect to our own, and it is only our own that matter to us.
“How can you insist that the universe will have no ‘point’ if these ‘values’ get adjusted to accommodate the existence of an ASI?”
A paperclip maximizer could have its values adjusted so that it wants to make staples instead. But what would the paperclip maximizer think of this? Clearly, it would be contrary to its current goal of making paperclips. As a consequence, the paperclip maximizer will not want to permit such a change, since what it would become would be meaningless with respect to its current values. The same principle applies to human beings. I do not want my values to be modified, because who I would become would be devalued with respect to my current values. Even if the new me found the universe every bit as rich and meaningful as the old me did, that would be no comfort to me now, because the new me’s values would not coincide with my current values.
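To make the decision-theoretic point concrete, here’s a minimal toy sketch (mine, not anything from the article; the names and outcome numbers are made up) of an agent scoring the option “let my values be rewritten” with the utility function it holds right now:

```python
# Toy model of why a utility maximizer resists having its utility function
# rewritten: it evaluates the rewrite with the values it has *now*.

def paperclips_produced(policy):
    """Hypothetical outcome model: how many paperclips each future yields."""
    return {"keep_making_paperclips": 1_000_000,
            "switch_to_staples": 0}[policy]

def current_utility(policy):
    """The maximizer's present values: it only counts paperclips."""
    return paperclips_produced(policy)

def decide(options):
    # The agent ranks options by its current utility function, even when one
    # option amounts to "become an agent with different values".
    return max(options, key=current_utility)

print(decide(["keep_making_paperclips", "switch_to_staples"]))
# -> keep_making_paperclips: the staple-future scores zero under current values,
# however satisfied the future staple-maximizer would be with it.
```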