I’m actually kinda amazed by how much I care about Karma.
I feel the same way, which seems to confirm something that occurred to me recently: the karma system is extremely important in that it determines the content, structure, and tone of posts and comments on LessWrong. It’s arguably the most important factor in determining how we communicate. If you care about karma, you’re going to say wildly different things than you would otherwise. How many times have you started typing a comment and then thought, “No, that’ll get downvoted”? How many times have you put a huge amount of effort into a comment or post to avoid downvotes? The point is, karma affects how we write and how we think, so it’s definitely a topic that deserves our attention.
On another note:
I’m surprised that no one (at the time of this writing) has mentioned karma’s most important function: it keeps trolling to a respectable minimum. The current karma system is remarkably efficient at keeping discussions civil and intelligent—it’s undoubtedly better than any other forum or blog I’ve seen. We should definitely keep this fact in mind when discussing potential changes.
I find it frightening how much I care about karma, actually.
How much do you care about karma?
I’m not sure how to measure that, but my brain seems to exhibit all the quirks that, in some experiments, are associated with money but not with tokens that are later exchanged for money. At least that’s what a quick introspection says.
Edit: and it’s relevant that my brain does not seem to treat money that way.
How it relates to trolling is important, but I consider that a mostly “solved” problem. I don’t think the introduction of Quirrell Points would influence trolling one way or another. If we use it, it’s to accomplish/prevent something else.
What’s also interesting, though, is how optimizing for Karma (as is) can actually degrade post quality. When I found the Time Magazine article about the Singularity, I was like “sweet! I get to post a link and harvest free Karma without doing any work!” I’m not sure whether I’ve ever consciously focused my efforts on posting with a goal of “karma-for-minimal-effort,” but an AI programmed to acquire Karma may well decide to do so.
If the karma system is working optimally, karma-for-minimal-effort should actually correspond to value-to-the-community-for-minimal-effort, which is a perfectly good thing to maximize. If it’s not, then the karma system simply needs tweaking.
I’m not entirely convinced that it’s a solved problem, but I don’t think that the introduction of Quirrell Points would have any influence on it. But, IMHO, the great strength of karma is that the community awards points, not just one person.
And I do agree that karma has the ability to degrade post quality—I’ve noticed that the first comment on a post will generally be upvoted the most (unless the comment is horribly wrong).
If you care about karma, you’re going to say wildly different things than you would otherwise. How many times have you started typing a comment and then thought, “No, that’ll get downvoted”? How many times have you put a huge amount of effort into a comment or post to avoid downvotes?
Never and never. My habitual way of writing fits pretty well into LW.
However, I’ve been known to stare at my point count while refreshing the page so that I can see right away whether it has gone up, and I’m quite intrigued by the possibility of making the top ten list.
I should probably take that possibility into account when thinking about top-level posts I could write.
How many times have you started typing a comment and then thought, “No, that’ll get downvoted”?
Are you doing it consciously? I do not doubt that I am influenced by the karma system somehow (I certainly have positive feelings when the score moves up), but intentionally tailoring comments for the purpose of karma gain seems wrong to me.
intentionally tailoring comments for the purpose of karma gain seems wrong to me.
Seems wrong why? Wrong how? That strikes me as an intuition worth analyzing.
We sometimes characterize what we are doing here as “sharing ideas”. But the word “sharing” is somewhat ambiguous. Are we sharing altruistically—providing others with something they need/want? Is it more of a social-bonding thing—similar to sharing of candy among schoolchildren, or sharing of juicy gossip or the latest joke?
Introspecting my own reasons for posting, I think that the primary motivations are approval-seeking and (sometimes) a desire for honest intellectual feedback regarding an idea I just had. If I tune the wording of a comment or posting for maximum karma, how am I subverting either of those motives?
As for my commenting, I can think of five basic classes of motivations:
1. Curiosity. This covers asking questions when I think others may have some insight or know something which I don’t. Sometimes, curiosity may lead me to try to initiate a discussion about some topic, usually when I am unable to formulate a direct question, or don’t know what to ask.
2. Sharing information. If I think some idea is worth attention (or is simply relevant and funny) and has not yet appeared in the thread, I may post it.
3. Exerting influence. I may try to convince others to do something (e.g. code a meetup bar for the main site, stop talking to “trolls”...) so that LW becomes more suitable to my preferences. Convincing others of something may fall here or under the previous category.
4. Social games. This includes expressing thanks for good comments or posts, supporting or criticising positions, and status grabbing by harvesting karma or displaying intelligence. Most motivations belonging to category 2 have a fair share of social signalling too.
5. Answering questions. (Edited to add: somehow I had overlooked the motivation of this very comment.)
I do consciously prefer the other types of motivation to the fourth one. One reason is that social signalling is already subconsciously the strongest motivator for participating in a discussion, and it may be preferable to compensate a bit. Standard debates all over the world are full of type-4 motivated contributions, and the results aren’t optimal.
That isn’t to say that we should avoid social signalling altogether, or that we shouldn’t take care with the formulation of a comment or a post. Only that signalling purposes should play a secondary role to the other motivations: if I want to ask a question or present an insight, I shouldn’t refrain from doing so based on an expectation of being downvoted (there are situations when I should, but those are exceptions). Similarly, I shouldn’t post a comment if the only motivation is to get upvotes.
There is a difference between communicating in a way approved by the community, and trying to communicate in order to get approval. If the latter were the norm, evaporative cooling of our opinions would become a serious danger. I was alerted mainly by this sentence in the grandparent comment:
If you care about karma, you’re going to say wildly different things than you would otherwise. (emphasis mine)
Saying wildly different things out of a desire for approval raises a particular red flag for me, and therefore I wanted clarification. Not that I am innocent in this respect: although I respect people who are willing to sacrifice some karma when necessary, I can’t remember posting anything controversial myself, which worries me a little.
How many times have you put a huge amount of effort into a comment or post to avoid downvotes? The point is, karma affects how we write and how we think, so it’s definitely a topic that deserves our attention.
I think the results show the system mostly works. I think there’s little evidence the karma system is, in fact, broken. If it ain’t broke, don’t fix it.
Agreed—I’m not suggesting an overhaul of the system, just that we should be mindful of how it affects our writing and thinking. I certainly don’t want to get rid of such an effective anti-troll mechanism.