I like this post a lot; it speaks to some of my concerns about this community and about the sorts of people I’d like to surround myself with. As an analytical/systematizing/whatever type (I got a 35 on the test Roko posted a while back; interpret from that what you will), I felt very strange about all of these rhetorical games for most of my life. It was only when I discovered signalling theory in my study of economics that they started to make sense. If I frame my social interactions as signalling problems, the goals, and the ways I should achieve them, become a lot clearer. It’s also a useful outsider perspective; I find that I can (occasionally) recognize what people are really trying to do, and how, better than they can, simply because they have a richer and more complicated view of human socializing.
While I agree with almost everything you’ve posted in the abstract, assuming your whole goal really is to have a reasoned discussion that provides evidence on which to update your beliefs, I think we often misunderstand our own intentions. The most common counterargument to “you should be nicer in how you frame your responses” is “I don’t have time for all that fluff”, but there is a whole lot more going on. First there are the personal signalling goals: not just to demonstrate that you are clever, but also that you are the sort of person who is upfront and honest (something people value) and not overly concerned with status in the eyes of your peers. This second one is amusing, since the purpose of the signal is to raise your status with the people who value not valuing status too much; children acting “too cool” to try hard in school are a good example of this. Additionally, there is the enforcement of group norms, not all of which are in perfect harmony. For example, LW has a group norm of open discussion with an eye toward gathering information about, and improving, human rationality. However, we also have a norm of not suffering fools gladly (what would happen if some nut posted about The Secret as a comment on the quantum physics sequence?). Oftentimes these two sync up, since we don’t want the discussion polluted by noisy nonsense, but sometimes they don’t; I’ve seen more than a few people with valuable ideas leave this community because of hostile treatment.
None of this is to say that people shouldn’t be nicer if they want to achieve their higher goals. That’s a principle I’ve tried to operate on, and I think it’s a good one. My point is that, just like the player who seems to play the ultimatum game irrationally by rejecting unfair offers, people who respond acrimoniously to their peers are often playing a game with a different goal, whether they know it or not. It may still be irrational, but it’s irrational in a complex and multifaceted way, and I doubt one post, or even a whole sprawling sequence of posts, would be enough to touch on all the complicated signals involved. The fact that the norm-enforcement algorithm feels a lot more individualistic and noble from the inside than it looks from the outside makes the whole problem a lot worse.
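To make the ultimatum-game point concrete, here’s a toy model of my own (nothing from the post itself, and the numbers are arbitrary): a responder who “irrationally” rejects any offer below a fixed threshold can out-earn a one-shot-rational responder, once proposers learn the threshold and adapt to it.

```python
# Toy ultimatum game: a proposer splits a pie, the responder accepts or
# rejects; on rejection both get nothing. One-shot rationality says accept
# any positive offer, but a known rejection threshold changes what
# proposers offer in the first place.

def proposer_offer(pie, known_threshold):
    # A proposer who knows the responder's threshold offers exactly that
    # much (or the minimum positive amount, whichever is larger).
    return max(known_threshold, 1)

def play_rounds(pie, threshold, rounds):
    total = 0
    for _ in range(rounds):
        offer = proposer_offer(pie, threshold)
        if offer >= threshold:
            total += offer  # accept the offer
        # else: reject, and both players get nothing this round
    return total

# The responder who accepts anything gets the minimum each round,
# while the "spiteful" responder demanding 40 of 100 earns far more.
accept_anything = play_rounds(100, 1, 10)   # 10 rounds x 1  = 10
demand_fairness = play_rounds(100, 40, 10)  # 10 rounds x 40 = 400
```

The model bakes in the key assumption, of course: that the threshold is visible and proposers adapt to it. That’s exactly the norm-enforcement logic above; the “irrational” play is a signal that shapes future offers.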