People underestimate the gap between stated preferences and revealed preferences.
Everything is actually about signalling.
Taken together, these two induce in me a sort of dysfunction. I have a stated preference for my stated preferences matching my revealed ones, i.e. genuine honesty over stated-preference-as-signalling. Yet it is highly likely that this stated preference is itself (1) inaccurate and (2) signalling. And I treat both consistency and honesty as something like terminal values, so I find this situation unacceptable. That seems to leave me four options:
1. Adjust my stated preferences to match my revealed ones. Abandon my ideas of what’s good and right in favor of whatever the monkey brain likes.
2. Rigidly adhere to my stated preferences, even when that leaves me unhappy because what would have been my revealed ones go unsatisfied.
3. Stop valuing intellectual integrity; accept hypocrisy and doublethink. Be happy.
4. Morbidly reflect on how fucked I am.
All of these alternatives seem horrible to me!
(2) and (4) are the correct approaches. “Revealed preferences” are, by and large, just the balance of the monkey-brain’s incentives, and scarcely yield any useful information or ordering about the choice you were originally trying to make anyway. Throw them out. You’re allowed to be stressed out about how “inhuman” it feels to throw them out, but throw them the hell out! Your conscious self will thank you later.
You are also allowed to optimize your life for taking care of the monkey-brain’s wants and needs without impacting the goals of the conscious self.
You are also allowed to deliberately choose which desires and goals get classified as “monkey brain” and which ones as “the real me”. After all, in truth, everything comes at least partially from the monkey-brain, and everything goes, at least at the last step before action, through the conscious self. Any apparent “division” into “several people” is just your model of what your brain is doing. The real you can eat cookies, wear leather jackets, and have sex sometimes. Oy gevalt, being a good person does not mean being a robot.
I advise something between path 1 and path 2. You fool yourself, saying one thing and doing another; but you legitimately want to be consistent (because it is more convincing if you are). So, once you observe the inconsistency, you react to it. In the Objectivist crowd, this has resulted in honesty about selfish behavior. In the LessWrong crowd, it has more often resulted in the dominance of the idealistic goals which previously served only as signalling.
Actually, in practice, 2 is fairly good signalling! It’s a costly signal of commitment to altruism. This is basically the only reason the rationalist community can socially survive, I guess. :p
3 is also perfectly valid in some sense, although it’s much further from the LessWrong aesthetic. But see A Dialog On Doublethink. And remember the Litany of Gendlin.
4 is also a necessary step I think, to see the magnitude of the problem. :)
The brain fills in a false memory of what you meant without asking for permission.
Reference? This terrifies me if true.
Again: good terror, justified terror.
I don’t have a reference, just an observation; I think that if you watch for it, you will see that it is true. It also fits with what we hear from things like The Apologist and the Revolutionary and prettyrational memes. It makes social sense that we would do this: the best way to fool others into thinking we meant X is to believe it ourselves. That helps us appear to win arguments (or at least save face with a less severe loss) and, even more importantly, helps us appear to have the best of intentions behind our actions.
People who seem not to do it are mostly just more clever about it. However, the more everyone is aware of this, the less people can get away with it. If you want to climb out of the gutter, you have to get your friends interested in climbing out too, or find friends who already are trying.
(Once you’ve convinced yourself it’s worth doing!)
People who seem not to do it are mostly just more clever about it.
Hmm. This statement is troublesome because it falls into the category of “I expect you not to see evidence for X in case Y, so here’s an excuse ahead of time!” arguments.
And the rest of the paragraph is an argument that you should not only believe my claim, but convince your friends too!
How convenient. :p
I would expect a witch to deny that they were signalling “not-witchness”.
I would expect a witch to preemptively accuse herself, so that no one else can gain status by doing so.
The good news is that there are others. Stated and “revealed” preferences don’t come out of nowhere, take it or leave it, choose one or the other. I use the scare quotes because the very name “revealed preference” embeds into the vocabulary an assumption, a whole story: that the “revealed” preference is in fact a revelation of a deeper truth. Cue another riff on this.
No, call revealed preferences merely what they visibly are: your actions. When there is a conflict between what you (this is the impersonal “you”) want to do and what you do, the thing to do is to find the roots of the conflict. What is actually happening when you do the thing you would not, and not the thing that you would?
Some will answer with this again, but real answers to questions about specific instances are not to be found in any story. Something happened when you acted the way you did not want to. There are techniques for getting at real answers to such questions, involving various processes of introspection and questioning, which I’m not going to try to expound, as I don’t think I can do the subject justice.
If rationality means winning, you should probably choose option 3.
Unless you have something to protect, in which case either 1 or 2 (probably 2) might serve you better.