On what basis do you assert you were “reasoned out” of that position?
I’ll admit it’s rather shaky and I’d be saying the same thing if I’d merely been brainwashed. It doesn’t feel like it was precipitated by anything other than legitimate moral argument, though. If I can be brainwashed out of my “terminal values” so easily, and it really doesn’t feel like something to resist, then I’d like a sturdier basis for my moral reasoning.
For example, what about your change of mind causes you to reject a conversation metaphor?
What is a conversation metaphor? I’m afraid I don’t see what you’re getting at.
The other way you might respond is that you have realized that you still value freedom, but have recently realized it is not a terminal value. But that makes the example less useful in figuring out how actual terminal values work.
I still value freedom in what feels like a fundamental way; I just also value hierarchy and social order now. What is gone is the extreme feeling of ickiness attached to authority, and the feeling of sacredness attached to freedom, and the belief that these things were terminal values.
The point is that things I’m likely to identify as “terminal values”, especially in the context of disagreements, are simply not that fundamental, and are much closer to derived surface heuristics or even tribal affiliation signals.
I feel like I’m not properly responding to your comment though.
Nyan, I think your freedom example is a little off. The opposite of freedom is not bowing down to a leader; it’s being made to bow. People choosing to bow can be beautiful and rational, but I fail to see any beauty in someone bowing when their values dictate they should stand.
The point is that things I’m likely to identify as “terminal values”, especially in the context of disagreements, are simply not that fundamental, and are much closer to derived surface heuristics or even tribal affiliation signals.
That’s certainly a serious risk, especially if terminal values work like axioms. There’s a strong incentive in debate or policy conflict to claim that an instrumental value is terminal just to insulate it from attack. And then, through the failure mode identified in “Keep Your Identity Small”, one is likely to come to believe that the value actually is a terminal value for oneself.
I feel like I’m not properly responding to your comment though.
I took your essay as trying to make a meta-ethical point about “terminal values” and how using the term with an incoherent definition causes confusion in the debate. This parallels when you said that if we interact with an unshielded utility, it’s over: we’ve committed a type error. If that was not your intent, then I’ve misunderstood the essay.
I took your essay as trying to make a meta-ethical point about “terminal values” and how using the term with an incoherent definition causes confusion in the debate. This parallels when you said that if we interact with an unshielded utility, it’s over: we’ve committed a type error. If that was not your intent, then I’ve misunderstood the essay.
Oops, it wasn’t really about how we use terms or anything. I’m trying to communicate that we are not as morally wise as we sometimes pretend to be, or think we are. That Moral Philosophy is an unsolved problem, and we don’t even have a good idea how to solve it (unlike, say, physics, where it’s unsolved, but the problem is understood).
This is in preparation for some other posts on the subject, the next of which will be posted tonight or soon after.
That Moral Philosophy is an unsolved problem, and we don’t even have a good idea how to solve it
That said, there have been centuries of work on the subject, which Eliezer unfortunately threw out because VNM-utilitarianism is so mathematically elegant.
If I can be brainwashed out of my “terminal values” so easily, and it really doesn’t feel like something to resist, then I’d like a sturdier basis for my moral reasoning.
Are you sure you aren’t simply trading open-ended beliefs for those that circularly support themselves to a greater extent? When you trust in an authority which tells you to trust in that authority, that’s sturdier.
I still value freedom in what feels like a fundamental way; I just also value hierarchy and social order now.
Gygax would say your alignment has shifted a step toward Lawful. I tend to prefer the Exalted system, which could represent such a shift through the purchase of a third dot in the virtue of Temperance.
What is a conversation metaphor? I’m afraid I don’t see what you’re getting at.
My fault for failing to clarify. There are roughly three ways one can talk about changes to an agent’s terminal values:
(1) Such changes never happen. (At a societal level, this proposition appears to be false.)
(2) Such changes happen through rational processes (i.e. reasoning).
(3) Such changes happen through non-rational processes (e.g. tribal affiliation + mindkilling).
I was using “conversion” as a metaphorical shorthand for the third type of change.
BTW, you might want to change “conversation” to “conversion” in the grandparent.
Ah! Thanks.
Ok. Then my answer to that is roughly what I said above: it’s shaky, and it doesn’t feel like it was precipitated by anything other than legitimate moral argument. This could of course use more detail, unless you understand what I’m getting at.