I’ve been thinking about this a lot.
Imagine Reddit + a prediction market. Instead of betting and winning money, you get enhanced karma and increased posting/commenting weight.
If you predict something successfully, say, the number of COVID deaths for the week, your posts and votes would carry more weight than those of the people who failed to predict the number correctly.
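A minimal sketch of how such weighting could work; the Brier-score bookkeeping and the weight formula are both illustrative assumptions on my part, not a spec for any real system:

```python
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    # Brier scores of past binary forecasts: 0.0 = perfect, 1.0 = worst.
    brier_scores: list = field(default_factory=list)

    def record_prediction(self, forecast_prob: float, outcome: bool) -> None:
        # Brier score of a single binary forecast: (p - outcome)^2.
        self.brier_scores.append((forecast_prob - (1.0 if outcome else 0.0)) ** 2)

    @property
    def vote_weight(self) -> float:
        # New users get baseline weight 1.0; good predictors approach 2.0,
        # consistently bad ones approach 0.0. (Arbitrary illustrative mapping.)
        if not self.brier_scores:
            return 1.0
        mean_brier = sum(self.brier_scores) / len(self.brier_scores)
        return 2.0 * (1.0 - mean_brier)

alice = User("alice")
alice.record_prediction(0.9, True)   # confident and right
alice.record_prediction(0.8, True)   # confident and right again
bob = User("bob")
bob.record_prediction(0.9, False)    # confident and wrong

# alice's votes now count for roughly five times bob's.
```

Scoring probabilistic forecasts (rather than yes/no guesses) also rewards calibration: someone who says "90%" and is right nine times out of ten outscores someone who always shouts "100%".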
Possible abuse: making many good predictions in order to accumulate points that you later sacrifice by giving an intentionally incorrect expert opinion on your pet topic.
You could even do it without sacrificing the accumulated points, by making outrageous conditional predictions whose conditions you believe will not happen; e.g. “if Elon Musk succeeds in getting his rocket to Mars in 2021, the government will stop vaccinating people against COVID (because all the powerful people will abandon Earth and ignore the suffering of the poor)”; a safe prediction to make if you believe Elon Musk will not get to Mars in 2021.
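The reason this exploit is cheap: in conditional markets, a prediction whose condition never occurs typically resolves N/A, so nothing is at stake. A toy settlement rule (the function name and stake handling are my own assumptions, just to make the incentive explicit):

```python
def settle_conditional(condition_happened: bool,
                       prediction_correct: bool,
                       stake: int) -> int:
    """Return the karma delta for one conditional prediction."""
    if not condition_happened:
        # Condition never triggered: resolves N/A, predictor risks nothing.
        return 0
    return stake if prediction_correct else -stake

# An outrageous claim gated on an unlikely condition is free to make:
settle_conditional(condition_happened=False, prediction_correct=False, stake=100)
```

So the karma cost of the "safe" outrageous conditional is zero unless the unlikely condition actually fires.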
Utopia isn’t an option, but that aside... I would argue it would still be better than today, where people can be consistently wrong and still get to keep giving incorrect opinions at full volume. And everyone else in the market would still learn along the way, which is an improvement over the current state.
I don’t think there would be much of a market for ridiculous conditionals like that, so I’m not too worried about it.
Yes.
By the way, Stack Exchange (which is not a prediction market, but still a mechanism that tracks reputation for people who give good answers) tracks the score separately per area of expertise, so e.g. correctly answering 100 questions on software development will not make you an expert on parenting, and vice versa. So this is a possible approach to checking for people who are “experts in X, crackpots in Y”.
Which again is not perfect; for example, correctly answering 100 questions on Java will automatically make you an expert on PHP. Perhaps the area of expertise could be defined more fluidly, e.g. using tags, and displaying how much of an expert the person is on given tags. (Which again raises the question of how the tags are assigned, especially if assigning the tag is itself part of the controversy...)
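A rough sketch of what per-tag reputation could look like; the averaging rule for multi-tag topics is an arbitrary assumption, and how tags get assigned is exactly the open question above:

```python
from collections import defaultdict

class TagReputation:
    """Reputation tracked per tag, so expertise doesn't transfer across topics."""

    def __init__(self):
        self.rep = defaultdict(int)  # tag -> accumulated points

    def award(self, tags, points):
        # Credit a correct prediction/answer to every tag on the question.
        for tag in tags:
            self.rep[tag] += points

    def weight_for(self, tags):
        # Vote weight on a topic = average reputation over its tags;
        # tags the user has never earned points in count as 0.
        if not tags:
            return 0
        return sum(self.rep[t] for t in tags) / len(tags)

u = TagReputation()
u.award(["java", "software"], 100)
u.weight_for(["java"])       # high: Java expertise counts here
u.weight_for(["parenting"])  # zero: Java karma doesn't transfer
```

Under this scheme, the "Java expert is automatically a PHP expert" problem only reappears if Java and PHP questions share a tag, which pushes the whole dispute into tag assignment, as noted.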
But yes, even “farming karma at X, burning it at Y” is preferable to the current system, which has no tracking whatsoever (except maybe “this person has a diploma” or “this person works for a newspaper”, which gives them unlimited karma to burn).