I’ve been thinking about this with regard to Less Wrong culture. I had pictured your “deliberative thinking” module as more of an “excuse generator”—the rest of your mind makes its decisions, and then the excuse generator comes up with an explanation for them.
The excuse generator is primarily social—it builds excuses appropriate to the culture it is in. So in a rationalist culture, it will come up with rationalizing excuses. It can be exposed to a lot of memes, parrot them back, and reason using them without any of this actually affecting your behavior at all.
Just sometimes, though, the excuse generator will fail and send a signal back to the rest of the mind that it really needs to change something, or else face social consequences.
The thing is, I don’t feel that this stuff is new. But try to point it out to anyone, and they will generate excuses as to why it doesn’t matter, or why everyone except them lacks the power of agency, or why it’s an interesting question they’ll get around to looking at sometime. So currently I’m a bit stuck.
I had pictured your “deliberative thinking” module as more of an “excuse generator”—the rest of your mind would make its decisions, and then the excuse generator comes up with an explanation for them.
I know this is a year or two late, but: I’ve noticed this and find it incredibly frustrating. Turning introspection (yes, I know) on my own internally-stated motivations more often than not reveals them to be either excuses or just plain bullshit. The most frequent failure mode is finding that I did something not because it was good, but because I wanted to be seen as the sort of person who would do it. Try though I might, it seems incredibly difficult to get my brain to not output Frankfurtian Bullshit.
I sort-of-intend to write a post about it one of these days.