Implicit in a 75% probability of X is a 25% probability of not-X
This may strike everyone as obvious...
My experience with the GJP suggests that it’s not. Some people there, for instance, are on record as assigning a 75% probability to the proposition “The number of registered Syrian conflict refugees reported by the UNHCR will exceed 250,000 at any point before 1 April 2013”.
Currently this number is 242,000; the trend over the past few months has been an increase of 1,000 to 2,000 a day, and the UNHCR has recently estimated that the total will eventually reach 700,000. This was clear as early as August. The kicker is that the 242K figure counts only the people who have been fully processed by the UNHCR administration and are officially in its database; tens of thousands more in the camps have only “appointments to be registered”.
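A back-of-the-envelope extrapolation from the figures above (the daily rates are the rough bounds I quoted, not official statistics) shows how little room there is for the count to stay under the threshold:

```python
# Days until the registered-refugee count crosses 250,000,
# extrapolating linearly from the figures quoted above.
current = 242_000
threshold = 250_000
gap = threshold - current  # 8,000 registrations to go

for rate in (1_000, 2_000):  # rough bounds on the recent daily increase
    days = gap / rate
    print(f"At {rate:,}/day the threshold is crossed in {days:.0f} days")
# At 1,000/day: 8 days; at 2,000/day: 4 days
```

And that is before counting the backlog of people with mere “appointments to be registered”.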
It’s hard for me to understand why people are not updating to at least 99%, if not 100%; these are the only answers worth considering. To state your probability as 85% or 91% (as some have quite recently) is to say, “There is roughly a one-in-ten chance that the Syrian conflict will suddenly stop and all these people will go home, all in the next few days before the count goes over.”
This is kind of like saying “There is a one in ten chance Santa Claus will be the one distributing the presents this year.”
It’s really, really weird that in a contest aimed at people who understand the notion of probability and calibration, people presumed to be would-be rationalists, you’d get this kind of “Clack”.
I can only speculate as to what’s going on there, but I think it must be along the following lines: queried for a probability, people are translating something like “Sure, it’s gonna happen” into a biggish number, and reporting that. They are totally failing to flip the question around and visualize what would have to happen for the prediction to come out false. (Perhaps, too, people have been so strongly cautioned by Tetlock’s writing against being overconfident that they reflexively shy away from the extreme numbers.)
My experience there casts some doubt on the statement “Probabilistic thinking is a remedy (...) so you’re hopefully automatically considering more than one world.”
At the very least, we must make a distinction between “express your beliefs in numerical terms and label these numbers ‘probabilities’” on the one hand, and “actually organize your thinking so as to respect the axioms of probability” on the other. Just because you use “75%” as a shorthand for “I’m pretty sure” doesn’t mean you are thinking probabilistically; you must train the skill of seeing that for some events “25%” also counts as “I’m pretty sure”.
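Scoring makes that distinction concrete. GJP scores forecasts with the Brier rule (squared error, lower is better); a minimal sketch, assuming for illustration that the event really occurs with probability 0.99, of what the under-confident reports cost in expectation:

```python
def brier(forecast: float, outcome: int) -> float:
    """Brier score for a binary event: squared error, lower is better."""
    return (forecast - outcome) ** 2

def expected_brier(forecast: float, true_p: float) -> float:
    """Expected Brier score if the event truly occurs with probability true_p."""
    return true_p * brier(forecast, 1) + (1 - true_p) * brier(forecast, 0)

TRUE_P = 0.99  # illustrative assumption: the event is all but certain
for forecast in (0.75, 0.85, 0.91, 0.99):
    print(f"report {forecast:.2f}: expected Brier {expected_brier(forecast, TRUE_P):.4f}")
# report 0.75: 0.0675 / 0.85: 0.0295 / 0.91: 0.0163 / 0.99: 0.0099
```

Because the Brier rule is proper, the expected score is minimized exactly when the reported number equals your actual degree of belief, so using “91%” as a synonym for “I’m pretty sure” is not a free stylistic choice; it is a standing bet you lose points on.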
My experience with the GJP suggests that it’s not. Some people there, for instance, are on record as assigning a 75% probability to the proposition “The number of registered Syrian conflict refugees reported by the UNHCR will exceed 250,000 at any point before 1 April 2013”.
I am a registered participant in one of the Good Judgement Project teams. I have literally no idea what my estimates of the probabilities are for quite a few of the events for which I have ‘current’ predictions. Depending on what you mean by ‘some people’, you might just be picking up on the fact that some people just don’t care as much about the accuracy of their predictions on GJP as you do.
some people just don’t care as much about the accuracy of their predictions on GJP
Agreed. Insofar as GJP is a contest, and the objective is to win, my remarks should be read with the implied proviso “assuming you care about winning”. In the prelude to the post where I discuss my GJP participation in more detail, I used an analogy with playing poker. I acknowledge that some people play poker for the thrill of the game and don’t actually mind losing their money, and that there are varying levels of motivation all the way up to dedicated players.
I think you are entirely right, that people don’t visualize.
I think you are 75% right.
Let’s do 1000 trials and see if it converges, verify that p<0.05, write a paper and publish.