A cautionary statement about betting on your beliefs from Tyler Cowen:
Bryan Caplan is pleased that he has won his bet with me, about whether unemployment will fall under five percent. … The Benthamite side of me will pay Bryan gladly, as I don’t think I’ve ever had a ten dollar expenditure of mine produce such a boost in the utility of another person.
That said, I think this episode is a good example of what is wrong with betting on ideas. Betting tends to lock people into positions, gets them rooting for one outcome over another, it makes the denouement of the bet about the relative status of the people in question, and it produces a celebratory mindset in the victor. That lowers the quality of dialogue and also introspection, just as political campaigns lower the quality of various ideas — too much emphasis on the candidates and the competition.
Yes. I can understand feeling locked in if you only make 1 bet every few years and it’s extremely high profile, and you make it part of your identity. But I can’t imagine feeling like that in any of my IEM or GJP trades (or even my PB predictions!), since I was taking positions in a number of markets and could regularly back off or take the other side when the price changed to something I disagreed with; there you are encouraged to disidentify with trades as much as possible and take an outside view where you’re just making one of many calibrated predictions.
This is definitely a flaw of rare, high-stakes, high-transaction-cost interpersonal bets: they’re good for calling ‘bullshit!’ but not so good for the less charged, broader aggregation and elicitation of views. That is something PB is good at, and a distributed prediction market might be even better at.
Seems like a problem that could be solved by making more bets.
If you only make one bet, you have either a 0% or a 100% success rate, and neither reflects how good you actually are.
The problem is not finding out “how good you actually are”. The problem is that making the bet locks you into a particular state of mind which involves more bias and less updating on evidence.
But still, if you make a lot of small bets instead of a few large ones, you have less chance of locking yourself into a hole.
The problem is that making the bet locks you into a particular state of mind which involves more bias and less updating on evidence.
It’s not clear that this would be the case. Even if you’re making only a few bets at a time (as opposed to participating in a liquid market), there will always be some odds at which you’ll want to hedge the bet from the other side.
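To make that hedging point concrete, here is a minimal sketch; the even odds, the $100 stake, and the updated probability of 0.3 are assumptions chosen for the example, not figures from the thread:

    def combined_payoff(x_happens: bool, hedge_stake: float) -> float:
        """Payoff of an original $100 even-odds bet on X plus an even-odds
        bet of `hedge_stake` dollars against X (all figures illustrative)."""
        original = 100.0 if x_happens else -100.0           # win or lose the original bet
        hedge = -hedge_stake if x_happens else hedge_stake  # the offsetting bet pays the other way
        return original + hedge

    # Suppose you later come to think P(X) = 0.3: an even-odds bet against X is
    # now attractive on its own, and staking the full $100 on it unwinds the
    # original position entirely: the combined payoff is 0 either way.
    print(combined_payoff(True, 100.0), combined_payoff(False, 100.0))  # 0.0 0.0

At better than even odds on the other side, sizing the hedge appropriately locks in a sure profit rather than just zeroing out; the general point is that once your belief moves away from the odds you got, there is some price at which taking the other side looks good.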
I think Robin Hanson has a pretty good response to this:
It has nearly the opposite effects for ideas I haven’t yet bet on but might feel tempted or obligated to bet on.
The bad effects are weaker if I can get out of the bet easily (as is the case on a high-volume prediction market).
A counterpoint to Tyler Cowen from Bryan Caplan, who won that particular bet.
The natural question is whether there’s a better betting scheme, one that would retain the compulsion to tell the truth but smooth the tribalism naturally present in the brain. For example, one could bet on both outcomes and pay the log of the probability of the wrong outcome but receive the log-prob of the outcome that is realized. Has this kind of scheme already been analyzed?
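For what it’s worth, the closest well-studied relative of this proposal is the logarithmic scoring rule, under which the forecaster’s reward is the log of the probability they assigned to the outcome that actually occurred; it is strictly proper, meaning expected reward is maximized by reporting one’s true probability. A minimal sketch for a binary event (the function name and sample numbers are illustrative, and the exact netting of the two legs in the comment’s version is left aside):

    import math

    def log_score(p_yes: float, outcome_happened: bool) -> float:
        """Logarithmic scoring rule for a binary event: the reward is the log
        of the probability the forecaster assigned to the realized outcome.
        Scores are always <= 0; closer to 0 means a better forecast."""
        p_realized = p_yes if outcome_happened else 1.0 - p_yes
        return math.log(p_realized)

    # A 70% forecast scores about -0.36 if the event happens and about -1.2 if
    # it doesn't; overconfidence is punished steeply as the assigned
    # probability of the realized outcome approaches zero.
    print(round(log_score(0.70, True), 2), round(log_score(0.70, False), 2))

Whether netting the two legs exactly as proposed keeps that truth-telling property is the kind of question the scoring-rule literature addresses.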
Not sure changing the payout schemes would help. The underlying issue that Tyler Cowen sees as a problem is that making a bet freezes your position in time, so to speak, and gives you a stake (if not a monetary one, then a status stake) to defend. That does not depend on the details of how the bet is arranged. And you can’t get around it, because getting some skin in the game is precisely the purpose of betting from Robin Hanson’s point of view.
That does not depend on the details of how the bet is arranged.
I would contest that this is the case only insofar as you have to bet on just one side. If you gain/lose stakes from both positions, possibly the “rooting for one outcome over another, it makes the denouement of the bet about the relative status of the people in question” would be diminished?
I don’t understand. At resolution time the event will have a single outcome. That single event outcome will lead to a single bet outcome. You can have complicated payout schemes, but after netting the outcome will be a single fixed number.