Earlier this week, Wikimedia executive director Sue Gardner explained how Wikipedia works (and sometimes doesn’t) in a Los Angeles Times op-ed:
Our weakest articles are those on obscure topics, where subtle bias and small mistakes can sometimes persist for months or even years. But Wikipedians are fierce guardians of quality, and they tend to challenge and remove bias and inaccuracy as soon as they see it.
The article on Barack Obama is a great example of this. Because it’s widely read and frequently edited, over the years it’s become comprehensive, objective and beautifully well sourced.
Using the Barack Obama article is cherry-picking, but the point stands: articles are generally only as good as the contributors they attract. Yesterday the Times’ Letters section published a response from a (wait for it) high school teacher, arguing against taking Wikipedia seriously:
Why use Wikipedia when library databases such as Proquest and Opposing Viewpoints, which contain PDF files of peer-reviewed, scholarly articles, are available? When given a choice between an article written by an unknown Internet user and one written by an expert, shouldn’t the choice be obvious?
Wikipedia is the lazy researcher’s source of information. It’s useful for a quick answer to a trivia question or resolving a bet, but it should not be used for serious research.
I thought we had settled the argument over Wikipedia as a source of information a while back, with the standard reply “look to the sources used as references,” but apparently that hasn’t made it to the school district yet.
The problem is that they’re both right as far as it goes, and we don’t really know how far that is.
Maybe what we need to figure out is this: what proportion of articles on important subjects are well developed and well cited, as opposed to mediocre or worse, and how would we go about measuring that? What this debate needs is some empirical data.