This feels like a worse version of epicycles, in that even if it’s kind of useful, it seems like it definitely is not what’s going on. The idea of lying being difficult seems to (A) presuppose a consciousness, and (B) make no sense—it seems like it would be much cheaper to evolve better lie-hiding mechanisms than to evolve consciousness. “Cognitive dissonance is adaptive with respect to expensive gestures” seems to explain pretty much all of what this theory is trying to address, without being weirdly centered on lying.
This feels like a theory that has been heavily massaged to fit facts, in the sense that your prediction of how such an individual would act seems to rather conveniently match how we actually act, rather than clearly and obviously predicting how we would act.
If there were an elegant alternative to this model, that would be better. But I don’t know of any other model that tries to comprehensively explain mental conflict without handwaving through the difficult parts.
Part of the problem may be that I assumed people already agreed with most of this model’s premises. Looking at the comments, I see I was totally wrong and the average person here doesn’t believe in things like the signaling theory of consciousness (which I thought was almost-universal in the Less Wrong community). So I might backtrack and try to make a sequence out of this, where I present each premise in order and slowly try to justify them. Maybe if people already believed all the premises it would look more like a reasonable way to fit a few large parts together, and less like LOOK HERE ARE TWELVE MILLION HYPOTHESIZED SYSTEMS AND SUBSYSTEMS THAT WHEN YOU PUT THEM TOGETHER KIND OF PRODUCE SOMETHING LIKE OUR BRAINS.
I’m pretty sure I agree with the theory that consciousness is the result of a social modeling arms race (i.e., signalling is one of the things driving the evolution of consciousness), but I think a sequence of posts would be good anyway. It’s good both to have a group of well-explained articles on which to found one’s thinking and to have a way to get new rationalists up to speed.
I would be curious about such a sequence.