In particular, one should be skeptical of having lots of people who consistently do worse than average.
I think, though, that it would, in fact, be worthwhile to do the analysis combining 2008 and 2010. I think Paul Krugman had already started panicking by then.
More interesting might be to see how much data it takes for prediction markets to beat most/all pundits.
I would expect Krugman to suffer penalties over the last few years; I don’t read him very much, but he seems to have gotten much more partisan and inaccurate as time passes.
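On the "how much data" question, here is a rough, purely illustrative sketch (the calibration gaps and noise levels are assumptions, not estimates from any real dataset) of how often a better-calibrated market would show a lower Brier score than a noisier pundit after a given number of resolved questions.

```python
# Hypothetical sketch: how many resolved questions before a better-calibrated
# "market" reliably shows a lower Brier score than a noisier "pundit"?
# The noise levels below are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

def brier(forecasts, outcomes):
    return np.mean((forecasts - outcomes) ** 2)

def market_beats_pundit(n_questions, market_noise=0.05, pundit_noise=0.20, trials=2000):
    wins = 0
    for _ in range(trials):
        p_true = rng.uniform(0.05, 0.95, n_questions)        # true event probabilities
        outcomes = rng.random(n_questions) < p_true           # resolved outcomes
        market = np.clip(p_true + rng.normal(0, market_noise, n_questions), 0.01, 0.99)
        pundit = np.clip(p_true + rng.normal(0, pundit_noise, n_questions), 0.01, 0.99)
        if brier(market, outcomes) < brier(pundit, outcomes):
            wins += 1
    return wins / trials

for n in (10, 25, 50, 100, 250):
    print(n, market_beats_pundit(n))
```

Under these made-up settings the win rate climbs toward 1 as the number of questions grows; the interesting empirical question is what the real gap looks like.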
In particular, one should be skeptical of having lots of people who consistently do worse than average.
Outliers? That’s actually what I would expect. People with superior prediction skills can become significantly positive. The same people could use their information backwards to become significantly negative, but it is damn hard to lose significantly and reliably to a vaguely efficient market if you are stupid (or uninformed).
Sorry, I should have said “worse than random”. To do worse than random, one would have to take a source of good predictions and twist it into a source of bad ones. The only plausible explanation I could think of for this is that you know a group of people who are good at predicting and habitually disagree with them. It seems like there should be far fewer such people than there are legitimate good predictors.
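As a purely illustrative sketch of that point (the underlying probabilities are made up): a coin-flipper’s Brier score is pinned at 0.25, a well-calibrated forecaster beats it, and the only way to do markedly worse than 0.25 is to effectively report the opposite of what a good forecaster would say.

```python
# Sketch: Brier scores for a good forecaster, a random guesser, and an
# "anti-forecaster" who reports the opposite of the good forecaster.
# The true probabilities are arbitrary assumptions for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
p_true = rng.uniform(0.1, 0.9, n)           # underlying event probabilities
outcomes = (rng.random(n) < p_true).astype(float)

good = p_true                               # well-calibrated forecasts
random_guess = np.full(n, 0.5)              # always says 50/50
inverted = 1.0 - good                       # systematically disagrees with the good forecaster

for name, f in [("good", good), ("random", random_guess), ("inverted", inverted)]:
    print(name, round(np.mean((f - outcomes) ** 2), 3))
# Roughly: good ~0.20, random 0.25, inverted ~0.41 — doing much worse than
# the coin-flipper requires anti-correlating with good information.
```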
It’s easy to lose to an efficient market if you’re not playing the efficient market’s games. If you take your stated probability and the market’s implied probability and make a bet somewhere in between, you are likely to lose money over time.
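To put hypothetical numbers on that: suppose the market’s price happens to equal the true probability while your own estimate is off; agreeing to trade at a price between the two then has negative expected value for you.

```python
# Sketch with assumed numbers: the event's true probability is 0.70, the market
# prices it at 0.70, you believe 0.50, and you trade at the midpoint 0.60.
# You think the contract is overpriced, so you sell it: collect 0.60 now, pay 1 if it happens.
p_true = 0.70      # assumed true probability
p_market = 0.70    # market's implied probability (assumed well-calibrated)
p_you = 0.50       # your stated probability
price = (p_market + p_you) / 2           # bet "somewhere in between"

expected_profit = price - p_true * 1.0   # sell one contract at `price`, pay 1 on the event
print(expected_profit)                   # -0.10: you lose ten cents per contract on average
```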
We are in complete agreement, and I should have been explicit and said I was refining a detail on an approximately valid point!
It seems like there should be far fewer such people than there are legitimate good predictors.
And it seems like those who do exist should have less money to be betting on markets! If not, then it would seem like the other group is making some darn poor strategic predictions regarding the rest of their life choices!
It’s easy to lose to an efficient market if you’re not playing the efficient market’s games.
Yes, like it is easy for a thief to get all my jewelry if I break into his house and put it on the table. Which I suppose is the sort of thing they do on Burn Notice to frame the bad guys for crimes. Which makes me wonder if it would be possible to frame someone for, say, insider trading or industrial espionage by losing money to someone such that their windfall is suspicious.
that you know a group of people who are good at predicting and habitually disagree with them.
It seems to me that this is exactly the sort of thing that can really happen in politics. Suppose you have two political parties, the Greens and the Blues, and that for historical reasons it happens that the Greens have adopted some ways of thinking that actually work well, and the Blues make it their practice to disagree with everything distinctive that the Greens say.
(And it could easily happen that there are more Blues than Greens, in which case you’d get lots of systematically bad predictors.)
Yes, like it is easy for a thief to get all my jewelry if I break into his house and put it on the table.
My point is that you’re losing in terms of prediction accuracy, not losing money.