If you missed it, see my comment here. I guess my comment which you responded to was somewhat misleading: I did not intend to claim anything about my actual future behavior; rather, I intended simply to state what I think my future behavior should be.
To put on my Robin Hanson hat, I’d note that you’re acknowledging this level of selflessness to be a Far value and probably not a Near one.
I have strong sympathies toward privileging Far values over Near ones in many of the cases where they conflict in practice, but it doesn’t seem quite accurate to declare that your Far values are your “true” ones and that the Near ones are to be discarded entirely.
So, I think that the right way to conceptualize this is to say that a given person’s values are not fixed but vary with time. I think that at the moment my true values are as I describe. In the course of being tortured, my true values would be very different from the way they are now.
The reason why I generally privilege Far values over Near values so much is that I value coherence a great deal, and I notice that my Near values are very incoherent. But of course if I were being tortured I would have more urgent concerns than coherence.
The Near/Far distinction is about more than just decisions made under duress or temptation. Far values have a strong signaling component, and they’re subject to their own biases.
Can you give an example of a bias which arises from Far values? I should say that I haven’t actually carefully read Hanson’s posts on Near vs. Far modes. In general I think that Hanson’s views of human nature are very misguided (though closer to the truth than is typical).
Okay, thanks for clarifying. I still haven’t read Robin Hanson on Near vs. Far (nor do I have much interest in doing so), but based on your characterization of Far, I would say that it’s important to strike a balance between Near and Far. I don’t really understand what part of my comment orthogonal is/was objecting to—maybe the issue is linguistic/semantic more than anything else.
I see what I say about my values in a neutral state as more representative of my “true values” than what I would say about my values in a state of distress. Yes, if I were actually in need of a heart transplant that came at the opportunity cost of something of greater social value, I might very well opt for the transplant. But if I could precommit right now, by pushing a button, to declining a transplant under such circumstances, then I would do so.
Similarly, if, while being tortured for a year, I were given the option to make the torture stop for a while in exchange for 50 more years of torture later on, I might take that option—but I would precommit now to refusing it if I could.
What you would do has little bearing on what you should do. The above argument doesn’t argue its case. If you are mistaken about your values, of course you can theoretically use those mistaken beliefs to consciously precommit to follow them, no question there.
Err, are you saying that his values are wrong, or just that they’re not in line with majoritarian values?
For one thing, multifoliaterose is probably extrapolating from the values xe signals, which aren’t identical to the values xe acts on. I don’t doubt the sincerity of multifoliaterose’s hypothetical resolve (and indeed I share it), but I suspect that I would find reasons to conclude otherwise were I actually in that situation. (Being signed up for cryonics might make me significantly more willing to actually refuse treatment in such a case, though!)
Willingness to wreck people’s lives (usually but not always other people’s) for the sake of values which may or may not be well thought out.
This is partly a matter of the signaling aspect, and partly because, since Far values are Far, you’re less likely to be accurate about them.
I’m saying that he acts under a mistaken idea about his true values. He should be more selfish (recognize himself as being more selfish).