Exactly. Here is an excellent article elaborating further. (Only quibble is that it was not just Silver; other data-based analysts like Sam Wang and Josh Putnam made essentially the same predictions):
When we talk about the epistemology of journalism, it all eventually ties into objectivity. The journalistic norm of objectivity is more than just a careful neutrality or attempt to appear unbiased; for journalists, it’s the grounds on which they claim the authority to describe reality to us. And the authority of objectivity is rooted in a particular process.
That process is very roughly this: Journalists get access to privileged information from official sources, then evaluate, filter, and order it through the rather ineffable quality alternatively known as “news judgment,” “news sense,” or “savvy.” This norm of objectivity is how political journalists say to the public (and to themselves), “This is why you can trust what we say we know — because we found it out through this process.” (This is far from a new observation – there are decades of sociological research on this.)
Silver’s process — his epistemology — is almost exactly the opposite of this:
Where political journalists’ information is privileged, his is public, coming from poll results that all the rest of us see, too.
Where political journalists’ information is evaluated through a subjective and nebulous professional/cultural sense of judgment, his evaluation is systematic and scientifically based. It involves judgment, too, but because it’s based in a scientific process, we can trace how he applied that judgment to reach his conclusions.
(…)
Joe Scarborough gets us even closer to the clash between processes of knowing when he tells Byers, “Nate Silver says this is a 73.6 percent chance that the president is going to win? Nobody in that campaign thinks they have a 73 percent chance — they think they have a 50.1 percent chance of winning. And you talk to the Romney people, it’s the same thing.” How does Scarborough know that Silver’s estimate is incorrect? He talked to sources in both campaigns. In Scarborough’s journalistic epistemology, this is the trump card: Silver’s methods cannot possibly produce more reliable information than the official sources themselves. These are the savviest, highest inside sources. They are the strongest form of epistemological proof — a “case closed” in an argument against calculations and numbers.
The other objection political journalists/pundits have to Silver’s process is evident here, too. They don’t just have a problem with how he knows what he knows, but with how he states it. Essentially, they are mistaking specificity for certainty. To them, the specificity of Silver’s projections smacks of arrogance because, again, their ways of knowing are incapable of producing that kind of specificity. It has to be an overstatement.
In actuality, of course, Silver’s specificity isn’t arrogance at all — it’s the natural product of a scientific, statistical way of producing knowledge. Statistical analyses produce specific numbers by their very nature. That doesn’t mean they’re certain: In fact, the epistemology of social science has long been far more tentative in reaching conclusions than the epistemology of journalism. As many people have noted over the past few days, a probability is not a prediction.
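To make the "specificity is not certainty" point concrete, here is a toy Monte Carlo poll-aggregation sketch. It is nothing like Silver's actual FiveThirtyEight model; the states, margins, and error sizes are all invented for illustration. The point is only that a statistical aggregation naturally emits a specific-looking figure like "73.6%" which is nevertheless a statement about uncertainty, not a flat prediction:

```python
# Toy Monte Carlo poll aggregator -- purely illustrative, NOT Nate Silver's model.
# All state margins and error sizes below are invented for the example.
import random

# (electoral votes, assumed polling margin for candidate A, in points)
STATES = {
    "Ohio":        (18,  2.0),
    "Florida":     (29,  0.5),
    "Virginia":    (13,  1.0),
    "Colorado":    (9,   1.5),
    "safe for A":  (237, 15.0),   # all safe states lumped together
    "safe for B":  (232, -15.0),
}
STATE_ERROR_SD = 3.0     # assumed per-state polling error (points)
NATIONAL_ERROR_SD = 2.0  # assumed correlated national polling error (points)
N_SIMS = 100_000

def simulate_once() -> int:
    """Draw one plausible election outcome under the assumed polling errors."""
    national_shift = random.gauss(0, NATIONAL_ERROR_SD)
    ev_for_a = 0
    for ev, margin in STATES.values():
        state_shift = random.gauss(0, STATE_ERROR_SD)
        if margin + national_shift + state_shift > 0:
            ev_for_a += ev
    return ev_for_a

wins = sum(simulate_once() >= 270 for _ in range(N_SIMS))
print(f"Candidate A win probability: {wins / N_SIMS:.1%}")
# Prints a very specific-looking figure (something on the order of 70%), but it
# is a summary of uncertainty: in the remaining simulated worlds, A loses.
```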
A dataset including Wang & Putnam, with accuracy scores:
http://www.gwern.net/2012%20election%20predictions
http://appliedrationality.org/2012/11/09/was-nate-silver-the-most-accurate-2012-election-pundit/