Wikipedia’s Epistemology: How Wikipedia determines truth. I’ll let David Gerard tell us what that was about.
Um, OK. This is an inchoate thing I’ve been bouncing around in my head for about the past six months. To attempt to summarise …
Normally, it’s a four-year liberal arts degree to learn the subtle arts of weighing up unreliable human-generated evidence and turning it into useful information. The way Wikipedia works means that you have to explain all that from scratch to argumentative teenagers with Wikipedia-induced aspergism in three paragraphs, and they’ll still argue it, ’cos it’s not like there’s people who really do know more than them about abstracting knowledge from data, is it.
This means that Wikipedia has evolved its own epistemology of where knowledge comes from. It means there’s a massive systemic bias against fields that aren’t favoured by people who don’t think like that. It also generates absurdities like regarding newspapers as “reliable sources”, which anyone who’s ever been quoted in one will laugh hysterically in horror at.
This is treated as though it is not just one epistemology of many, but the epistemology of how to abstract truth for an encyclopedia.
This is enough of a problem that I know humanities scholars who know Wikipedia in depth but are having to work out what the hell they can do about this, as academic experts in various fields start bringing themselves to Wikipedia even if it gets idiots in their faces, just to get their field properly represented.
A further problem is that early Wikipedians were encyclopedia nerds who could answer “What’s an encyclopedia?” by pointing to Britannica and saying “It’s a bit like that.” There are kids now who have never had any other encyclopedia than Wikipedia. So “what is an encyclopedia?” is coming loose from history. This may be good or bad. I suspect it’s bad but would be willing to be convinced it wasn’t.
The above needs work and, the hard bit, proposed solutions. That last is what I’ve been stuck on.
Normally, it’s a four-year liberal arts degree to learn the subtle arts of weighing up unreliable human-generated evidence and turning it into useful information
I’ve never heard of that being taught in college. Is there a Bayesian stats class involved? Could these alleged evidence-weighers combine two likelihood ratios with a prior?
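For readers unfamiliar with the jargon: "combining two likelihood ratios with a prior" is the basic Bayesian update in odds form, where posterior odds = prior odds × LR₁ × LR₂. A minimal sketch, with made-up numbers purely for illustration (and assuming the two pieces of evidence are independent):

```python
def update_odds(prior_prob, *likelihood_ratios):
    """Return the posterior probability after multiplying the prior odds
    by each likelihood ratio (assumes independent pieces of evidence)."""
    odds = prior_prob / (1 - prior_prob)
    for lr in likelihood_ratios:
        odds *= lr
    return odds / (1 + odds)

# A 10% prior, one strong piece of evidence (LR = 9), one weak one (LR = 2):
print(update_odds(0.10, 9, 2))  # prints 0.6666666666666666
```

The point of the rhetorical question above is that this two-line calculation is the *easy* part of weighing evidence; the hard part is assigning the likelihood ratios in the first place.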
I mean, I’m sorry, but the above is just a ridiculous assertion. If there were any four-year university degree which taught people how to weigh evidence correctly, the world would look very different from the way it currently does.
David_Gerard’s reasoning seems to me to depend less on the assertion that a four-year liberal arts degree is sufficient to extract truth from human-generated evidence by some set of external standards, and more on the assertion that however flawed the liberal-arts methodology is, WP:* generates some unique and serious issues of its own.
That seems pretty reasonable to me.
Pretty much. The Wikipedia method is actually worse.
Can you give some examples of the problem?