Sometimes we talk about unnecessarily complex potential karma/upvote systems, so I thought I would throw out an idea along those lines:
Every time you post, you’re prompted to predict the upvote/downvote ratio of your post.
Instead of being scored on raw upvotes, you’re scored on something more like how accurately you predicted the future upvote/downvote ratio.
So if you write a good post that you expect to be upvoted, you predict a high upvote/downvote ratio. If you're well calibrated to your audience, you actually achieve the ratio you predicted, and the system rewards you "extra".
And here's the cool part. If you write a lazy, low-effort post, or you're trolling, or you write any kind of post that you expect to be poorly received, you have two options. You can lie about the expected upvote/downvote ratio, inputting a high expected ratio; then the system penalizes you even more when you actually get a low u/d ratio, and considers you a poorly calibrated poster. Or you can be honest about the u/d ratio you expect, in which case the system can preemptively tell you not to bother posting stuff like that, hide it, or penalize it in some other way.
Overall you end up with a system that rewards users who (1) are well calibrated regarding the quality of their posts and (2) refrain from posting content they know to be bad, by making them explicitly admit that it's bad before they post it (and perhaps by hiding the content as well).
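To make the scoring concrete, here's a minimal sketch; encoding the ratio as upvotes / (upvotes + downvotes) and using a quadratic (Brier-style) penalty are my assumptions, not requirements of the idea:

```python
def calibration_score(predicted_ratio, upvotes, downvotes):
    """Score a post by prediction accuracy rather than raw karma.

    Both the prediction and the outcome are expressed as
    upvotes / (upvotes + downvotes), so they live in [0, 1].
    The quadratic penalty is just one proper scoring rule among many.
    """
    total = upvotes + downvotes
    if total == 0:
        return 0.0  # no votes yet, nothing to score against
    actual_ratio = upvotes / total
    # A perfect prediction scores 1; being maximally wrong scores 0.
    return 1.0 - (predicted_ratio - actual_ratio) ** 2
```

Under this rule, predicting 0.9 and landing at 0.85 scores 0.9975, while predicting 0.9 and landing at 0.3 scores only 0.64, so inflating your prediction on a post you expect to do badly costs you more than honesty would.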
Knowing that your post will get a low score is not equivalent to knowing that it is bad.
That’s true. But there are few circumstances that would warrant posting a comment that you know most people in your community will think is bad.
If you want to say something you expect to be unpopular, you can almost always phrase it in a way that contextualizes why you are saying it, and urges people to consider the extenuating context before downvoting. If you don't do this, then you're just doing exactly what you shouldn't be doing, if your goal was to make some kind of change.
edit: Another possibility would be this: instead of suppressing posts that you have predicted to be poorly received, the system simply forces you to sit on them for an hour or so before posting. This should reduce the odds that you are writing something in the heat of the moment and increase the relative odds that your probably-controversial post is actually valuable.
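A minimal sketch of that gate; the 0.5 cutoff and the exact one-hour delay are placeholder parameters, not part of the suggestion beyond "an hour or so":

```python
from datetime import datetime, timedelta, timezone

COOL_OFF = timedelta(hours=1)   # "an hour or so"
LOW_RATIO_CUTOFF = 0.5          # assumed threshold for "poorly received"

def publish_time(predicted_ratio, now=None):
    """Return when a post should go live, given the author's own prediction."""
    now = now or datetime.now(timezone.utc)
    if predicted_ratio < LOW_RATIO_CUTOFF:
        # Don't suppress the post; just make the author sit on it.
        return now + COOL_OFF
    return now  # posts expected to be well received go up immediately
```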
I get the feeling that LW has a lot of lurkers with interesting things to say, but who are too afraid to say them. They may eventually build up the courage they need to contribute to the community, but this system would scare them off. They don’t yet have enough data to predict how well their posts would be received. We need to be doing the opposite and remove some of the barriers to joining in.
On the other hand, trolls don’t care that much about karma. They’ll just exploit sock puppets.
Yeah, LW would probably not be the place to try this. I would guess that most potential karma systems only truly function correctly with a sufficient population of users, a sufficient number of people reading each post. LW has atrophied too much for this.
I really like the idea, but agree that it is sadly not the right thing here. It would be a fun addition to an Arbital-like site.
The thing is that without downvotes, there aren't actually that many barriers to joining in. If someone has a problem with something you say, they have to actually say so, instead of just downvoting, which is what often happens on Reddit. And I think this is better because it pairs negative reward with feedback, so that people who either have misunderstandings or articulate their views poorly can get better over time. The worst thing is getting downvoted without knowing why. I don't know if this has been tried anywhere, but maybe a system where every vote necessitates a comment would work better, so that it's always clear why the community received a remark the way it did.
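For the vote-requires-comment variant, the data model could be as simple as this (a sketch; the field names are illustrative, not taken from any existing forum software):

```python
from dataclasses import dataclass

@dataclass
class Vote:
    voter_id: str
    post_id: str
    direction: int  # +1 for upvote, -1 for downvote
    reason: str     # required: the comment explaining the vote

    def __post_init__(self):
        if self.direction not in (1, -1):
            raise ValueError("direction must be +1 or -1")
        if not self.reason.strip():
            raise ValueError("every vote must come with an explanation")
```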