Again, you’ve pulled a statement out of the context of a discussion about the behavior of a self-modifying AI. So, fine. In my current condition I wouldn’t build a baby mulcher. That doesn’t mean that I might not build a baby mulcher if I had the ability to change my values. You might as well say that I terminally value not flying when I flap my arms. The thing you’re discussing just isn’t physically allowed. People terminally value only what they’re doing at any given moment because the laws of physics say that they have no choice.
As far as I know, terminal values are things that are valuable in and of themselves. I don’t consider not building baby-mulchers to be valuable in and of itself. There may be some scenario in which building baby-mulchers is more valuable to me than not, and in that scenario I would build one. Likewise with doomsday devices. It’s difficult to predict what that scenario would look like, but given that other humans have built them, I assume that I would too. In those circumstances, if I could turn off the parts of my brain that make me squeamish about doing that, I certainly would. I don’t think that not doing horrible things is valuable in and of itself; it’s just a way of avoiding feeling horrible. If I could avoid feeling horrible and found value in doing horrible things, then I would probably do them.
People terminally value only what they’re doing at any given moment because the laws of physics say that they have no choice.
Huh? That makes no sense. How do you define “terminal value”?
In the statement that you were responding to, I was defining it the way you seemed to when you said that “some ‘moral values’ are biologically hardwired into humans.” You were saying that given the current state of their hardware, their inability to do something different makes the value terminal. This is analogous to saying that given the current state of the universe, whatever a person is doing at any given moment is a terminal value because of their inability to do something different.
I don’t think that not doing horrible things is valuable in and of itself; it’s just a way of avoiding feeling horrible.
OK. I appreciate you biting the bullet.
You were saying that given the current state of their hardware, their inability to do something different makes the value terminal.
No, that is NOT what I am saying. “Biologically hardwired” basically means you are born with these values, and while overcoming them is possible, it will take extra effort. It certainly does not mean that you have no choice. Humans do something other than what their biologically hardwired terminal values tell them on a very regular basis. One reason for this is that people hold many values, and those values tend not to be consistent with one another.
I think you’re confusing “terminal” and “immutable”. Terminal values can and do change.
And why is that? Do you, perchance, have some terminal moral value which disapproves?
So how does this relate to the discussion on AI?