Prediction Markets Going Wrong?

How many ways are there for a prediction market to go wrong?
In my story’s current draft, once my protagonist upload has made a few copies of himself, I have him start up a prediction market to try to improve his decision-making, such as estimating the likelihood that any given plan will reach a useful goal. (Using currency created ex nihilo via a Bitcoin-like blockchain.) His near-identical copies end up coming to an overconfident consensus, leading to an explosive disaster, which leads to attempts to deliberately diversify his copies’ mindstates: initially by simple psychological priming, later by potentially more dubious experimentation.
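(As a side note on the overconfident-consensus failure, here is a minimal, purely illustrative simulation; the noise levels are made-up assumptions, not anything from the story. It shows why pooling the estimates of near-identical copies barely beats a single copy: the error they share in common never averages out, no matter how many copies vote.)

```python
import numpy as np

rng = np.random.default_rng(0)

def consensus_error(n_copies, shared_noise, private_noise, trials=10_000):
    """Mean absolute error of the pooled estimate when the true log-odds are 0."""
    shared = rng.normal(0, shared_noise, size=trials)                 # bias common to every copy
    private = rng.normal(0, private_noise, size=(trials, n_copies))   # each copy's own noise
    estimates = shared[:, None] + private          # each copy's log-odds estimate
    pooled = estimates.mean(axis=1)                # crude stand-in for the market consensus
    return np.abs(pooled).mean()

for n in (1, 4, 16, 64):
    # Error plateaus near the shared-noise level instead of shrinking toward zero.
    print(n, round(consensus_error(n, shared_noise=1.0, private_noise=1.0), 3))
```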
I only have an interested amateur’s understanding of prediction markets, and almost no conception of the complications involved, such as what design choices are available when founding one, or how many ways one can go wrong (other than the particular failure described in the previous paragraph). If there’s anything you’d like to see in a story with a prediction market, or if you have any advice or references I can read up on, I’d appreciate your input.
(The idea I’m currently pondering: whether it would be feasible to use a prediction market to generate answers to the question, “Which rights (or ‘rights’) should we include in our constitutional Bill of Rights, and which should we leave out?”)
It might make sense to read about Augur in detail, along with the justifications for its design choices.
It’s worth understanding how the scoring of predictions works. Do you have a central authority? Do you have something like Augur’s reputation system?
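For concreteness, here is a minimal sketch of two standard proper scoring rules (Brier and logarithmic); this is textbook material, not Augur’s actual mechanism:

```python
import math

def brier_score(prob: float, outcome: int) -> float:
    """Squared error between the stated probability and the 0/1 outcome (lower is better)."""
    return (prob - outcome) ** 2

def log_score(prob: float, outcome: int) -> float:
    """Log loss: punishes confident wrong predictions much more harshly (lower is better)."""
    p = prob if outcome == 1 else 1.0 - prob
    return -math.log(p)

# A 95%-confident prediction that turns out wrong:
print(brier_score(0.95, 0), log_score(0.95, 0))   # 0.9025 vs. ~3.00
```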
For abstract goals like “reach a useful goal”, it’s quite important who actually puts forward the wording of a specific question.
How can predictions be judged? Augur has Right/Wrong/Unclear or Immoral. Immoral is particularly interesting: did a certain person die in order to fix the prediction market, making it effectively an assassination market that should be judged as immoral, or not?
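A hypothetical sketch of how those four judgment categories might map to payouts (my own simplification, not Augur’s actual settlement logic): unclear or immoral markets simply void and refund.

```python
from enum import Enum

class Resolution(Enum):
    RIGHT = "right"      # the predicted event happened
    WRONG = "wrong"      # it did not
    UNCLEAR = "unclear"  # the question can't be judged
    IMMORAL = "immoral"  # e.g. an assassination market

def settle(stake_on_yes: float, stake_on_no: float, resolution: Resolution):
    """Return (payout to YES holders, payout to NO holders)."""
    pot = stake_on_yes + stake_on_no
    if resolution is Resolution.RIGHT:
        return pot, 0.0
    if resolution is Resolution.WRONG:
        return 0.0, pot
    return stake_on_yes, stake_on_no   # unclear/immoral: void the market and refund stakes
```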
What are the mechanics of the crypto-currency? If you take Augur, some bets might be made in complicated currencies like Dai. That currency might crash because there isn’t enough collateral backing it to absorb a changing price.
Prediction markets go wrong when there’s high inflation in the underlying money, because a prediction market requires locking up money for a given time. If the goal lies far in the future, there can be a need to pay subsidies to counteract the interest someone would otherwise earn on that money.
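As a rough illustration (the 8% rate and 5-year horizon are arbitrary assumptions), the subsidy needed just to match what the locked-up stake would have earned elsewhere grows quickly with the time horizon:

```python
def required_subsidy(stake: float, annual_rate: float, years: float) -> float:
    """Extra payout needed so locking `stake` for `years` matches simply earning `annual_rate`."""
    return stake * ((1 + annual_rate) ** years - 1)

# Locking 1000 units for 5 years while interest/inflation runs at 8% per year:
print(round(required_subsidy(1000, 0.08, 5), 2))   # ~469.33
```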
How liquid is the prediction market, and how much money can be gained by affecting the result through buying shares to manipulate the price?
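One standard way to make the liquidity/manipulation trade-off concrete is Hanson’s logarithmic market scoring rule (LMSR) market maker; the sketch below is generic, not how Augur or the story’s market necessarily works. The cost of pushing the price scales linearly with the liquidity parameter b, so a thin market (small b) is cheap to manipulate and a deep one is expensive:

```python
import math

def lmsr_cost(q_yes: float, q_no: float, b: float) -> float:
    """LMSR cost function; b is the liquidity parameter (and sets the sponsor's subsidy scale)."""
    return b * math.log(math.exp(q_yes / b) + math.exp(q_no / b))

def cost_to_move_price(target: float, b: float) -> float:
    """Cost of buying YES shares until the YES price rises from 0.5 to `target`."""
    # price = 1 / (1 + exp(-(q_yes - q_no) / b)), so invert for the needed share imbalance:
    dq = b * math.log(target / (1 - target))
    return lmsr_cost(dq, 0.0, b) - lmsr_cost(0.0, 0.0, b)

for b in (10, 100, 1000):   # deeper markets are proportionally more expensive to push
    print(b, round(cost_to_move_price(0.9, b), 2))
```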
Prediction markets are driven by money. If someone has more money than the rest of the prediction market combined, they can burn that money to support a wrong prediction. Why would they do that? Presumably because doing so brings them more benefit than the money spent (a rough sketch of what that burning costs follows the two scenarios below). I can imagine two situations:
a) The prediction markets are new and only a few people bet there, so there is not much money in them. You may have a business that would be endangered by prediction markets becoming popular (e.g. people pay you for providing expert opinions). Thus you spend money hoping to discredit the very concept of a prediction market.
b) People start trusting prediction markets blindly. For example, if the market predicts that X will become president, people will not even bother to vote for the competitors, because “what’s the point? they’re going to lose anyway”. In such a case, X may be willing to burn a lot of money to convince people that he is going to become president, because that will become a self-fulfilling prophecy, and his sponsors are willing to invest that money. For the sake of the story, let’s make it dramatic: the person intends to become a dictator, nationalize a lot of assets, and give them to his sponsors; this is why the sponsors are willing to invest insane amounts of money, betting on getting much more in return.
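Here is the rough sketch promised above of what burning money on a wrong prediction costs; the numbers are arbitrary. Every informed trader who bets against the distorted price is, in expectation, being paid directly by the manipulator, so the burn rate scales with however much smart money shows up:

```python
def expected_transfer(manipulated_yes_price: float, true_yes_prob: float, informed_capital: float) -> float:
    """Expected profit informed traders extract by buying NO at the manipulator's distorted price."""
    no_price = 1.0 - manipulated_yes_price             # informed traders buy the artificially cheap side
    shares = informed_capital / no_price                # NO shares they can afford
    expected_payout = shares * (1.0 - true_yes_prob)    # each NO share pays 1 if YES fails
    return expected_payout - informed_capital

# Holding the YES price at 0.9 when the true probability is 0.1:
print(expected_transfer(0.9, 0.1, informed_capital=1000))   # 8000.0 expected loss for the manipulator
```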
By the way, a prediction market being right in general doesn’t mean that every answer will be correct. There may be questions where most people don’t have a clue, and those will attract much less betting (but still some, because people are irrational).
Two things come to mind, but they are possibly only tangentially related. First: the second half of McAfee’s Introduction to Economic Analysis (available online) is devoted to market inefficiencies. Second: there’s a chapter in Jaynes’s book (Probability Theory: The Logic of Science) about group invariance, that is, how a piece of information can leave the predictions of a set of agents unchanged. Might be relevant.