William Beutler on Wikipedia


The Agony and Ecstasy of Wikidata

on April 12, 2012 at 8:31 am

Although Wikipedia is by far the best-known of the Wikimedia collaborative projects, it is just one of many. Just this last week, Wikimedia Deutschland announced its latest contribution: Wikidata (also @Wikidata, and see this interview in the Wikipedia Signpost). Still under development, its temporary homepage announces:

Wikidata aims to create a free knowledge base about the world that can be read and edited by humans and machines alike. It will provide data in all the languages of the Wikimedia projects, and allow for the central access to data in a similar vein as Wikimedia Commons does for multimedia files. Wikidata is proposed as a new Wikimedia hosted and maintained project.

One of a few Wikidata logos under consideration.

Upon its announcement, I tweeted my initial impression, that it sounded like Wikipedia’s answer to Wolfram Alpha, the commercial “answer engine” created by Stephen Wolfram in 2009. It seems to be partly that, but also more, and its apparent ambition—not to mention the speculation surrounding it—is causing a stir.

Already touted by TechCrunch as “Wikipedia’s next big thing” (incorrectly identifying Wikipedia as its primary driver, I pedantically note), Wikidata will create a central database for the countless numbers, statistics and figures currently found in Wikipedia’s articles. The centralized collection of data will allow for quick updates and uniformity of statistical information across Wikipedia.

Currently, when new information replaces old—as when census surveys, election results and quarterly reports are published—Wikipedians must manually update the old data in every article in which it appears, across every language. Wikidata would make possible a quick, computer-led update to replace all out-of-date information. Additionally, it is expected that Wikidata will allow visitors to search and access information in a less labor-intensive way. As TechCrunch suggests:

Wikidata will also enable users to ask different types of questions, like which of the world’s ten largest cities have a female mayor?, for example. Queries like this are today answered by user-created Wikipedia Lists – that is, manually created structured answers. Wikidata, on the [other] hand, will be able to create these lists automatically.
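To make the idea concrete, here is a minimal sketch of the kind of query such a structured database would make trivial. Everything here is hypothetical: the record fields and the city data are invented for illustration and do not reflect Wikidata’s actual schema or any real statistics.

```python
# Illustrative sketch only: field names and values are hypothetical,
# not drawn from any real Wikidata schema or census.
cities = [
    {"name": "Alphaville", "population": 9_000_000, "mayor_gender": "female"},
    {"name": "Betatown", "population": 8_500_000, "mayor_gender": "male"},
    {"name": "Gammaburg", "population": 7_200_000, "mayor_gender": "female"},
]

def largest_cities_with_female_mayor(records, n=10):
    """Of the top-n cities by population, return those with a female mayor."""
    top = sorted(records, key=lambda c: c["population"], reverse=True)[:n]
    return [c["name"] for c in top if c["mayor_gender"] == "female"]
```

Against a central store of machine-readable facts, a query like this runs automatically; the manually maintained Wikipedia Lists it would replace have to be rebuilt by hand every time the underlying figures change.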

This project—which is funded by the Allen Institute for Artificial Intelligence, the Gordon and Betty Moore Foundation, and Google—is expected to take about a year to develop, but the blogosphere is already buzzing.

It’s probably fair to say that the overall response has been very positive. In a long post summarizing Wikidata’s aims, Yahoo! Labs researcher Nicolas Torzec identifies himself as one who excitedly awaits the changes Wikidata promises:

By providing and integrating Wikipedia with one common source of structured data that anyone can edit and use, Wikidata should enable higher consistency and quality within Wikipedia articles, increase the availability of information in and across Wikipedias, and decrease the maintenance effort for the editors working on Wikipedia. At the same time, it will also enable new types of Wikipedia pages and applications, including dynamically-generated timelines, maps, and charts; automatically-generated lists and aggregates; semantic search; light question & answering; etc. And because all these data will be available as Open Data in a machine-readable form, they will also benefit thrid-party [sic] knowledge-based projects at large Web companies such as Google, Bing, Facebook and Yahoo!, as well as at smaller Web startups…

Asked for comment by CNet, Andrew Lih, author of The Wikipedia Revolution, called it a “logical progression” for Wikipedia, even as he worries that Wikidata will drive away Wikipedians who are less tech-savvy, as it complicates the way in which information is recorded.

Also cautious is SEO blogger Pat Marcello, who warns that human error is still a very real possibility. She writes:

Wikidata is going to be just like Wikipedia in that it will be UGC (user-generated content) in many instances. So, how reliable will it be? I mean, when I write something — anything from a blog post to a book, I want the data I use in that work to be 100% accurate. I fear that just as with Wikipedia, the information you get may not be 100%, and with the volume of data they plan to include, there’s no way to vette [sic] all of the information.

Fair enough, but of course the upside is that corrections can be easily made. If one already uses Wikipedia, this tradeoff is very familiar.

The most critical voice so far is Mark Graham, an English geographer (and a fellow participant in the January 2010 WikiWars conference) who published “The Problem with Wikidata” on The Atlantic’s website this week:

This is a highly significant and hugely important change to the ways that Wikipedia works. Until now, the Wikipedia community has never attempted any sort of consistency across all languages. …

It is important that different communities are able to create and reproduce different truths and worldviews. And while certain truths are universal (Tokyo is described as a capital city in every language version that includes an article about Japan), others are more messy and unclear (e.g. should the population of Israel include occupied and contested territories?).

The reason that Wikidata marks such a significant moment in Wikipedia’s history is the fact that it eliminates some of the scope for culturally contingent representations of places, processes, people, and events. However, even more concerning is the fact that this sort of congealed and structured knowledge is unlikely to reflect the opinions and beliefs of traditionally marginalized groups.

The comments on the article are interesting, with some voices sharing Graham’s concerns, while others argue his concerns are overstated:

While there are exceptions, most of the information (and bias) in Wikipedia articles is contained within the prose and will be unaffected by Wikidata. … It’s quite possible that Wikidata will initially provide a lopsided database with a heavy emphasis on the developed world. But Wikipedia’s increasing focus on globalization and the tremendous potential of the open editing model make it one of the best candidates for mitigating that factor within the Semantic Web.

Wikimedia and Wikipedia’s slant toward the North, the West, and English speakers are well-covered in Wikipedia’s own list of its systemic biases, and Wikidata can’t help but face the same challenges. Meanwhile, another commenter argued:

The sky is falling! Or not, take your pick. Other commenters have made more informed posts than this, but does Wikidata’s existence force Wikipedia to use it? Probably not. … But if Wikidata has a graph of the Israel boundary–even multiple graphs–I suppose that the various Wikipedia authors could use one, or several, or none and make their own…which might get edited by someone else.

Under the canny (partial) title of “Who Will Be Mostly Right … ?” on the blog Data Liberate, Richard Wallis writes:

I share some of [Graham’s] concerns, but also draw comfort from some of the things Denny said in Berlin – “WikiData will not define the truth, it will collect the references to the data…. WikiData created articles on a topic will point to the relevant Wikipedia articles in all languages.” They obviously intend to capture facts described in different languages, the question is will they also preserve the local differences in assertion. In a world where we still can not totally agree on the height of our tallest mountain, we must be able to take account of and report differences of opinion.

Evidence that those behind Wikidata have anticipated a response similar to Graham’s can be found on the blog Too Big to Know, where technologist David Weinberger shared a snippet of an IRC chat he had with a Wikimedian:

[11:29] hi. I’m very interested in wikidata and am trying to write a brief blog post, and have a n00b question.
[11:29] go ahead!
[11:30] When there’s disagreement about a fact, will there be a discussion page where the differences can be worked through in public?
[11:30] two-fold answer
[11:30] 1. there will be a discussion page, yes
[11:31] 2. every fact can always have references accompanying it. so it is not about “does berlin really have 3.5 mio people” but about “does source X say that berlin has 3.5 mio people”
[11:31] wikidata is not about truth
[11:31] but about referenceable facts

The compiled phrase “Wikidata is not about truth, but about referenceable facts” is an intentional echo of Wikipedia’s oft-debated but longstanding allegiance to “verifiability, not truth”. Unsurprisingly, this familiar debate is playing itself out around Wikidata already.
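The data model implied by that exchange—statements that carry source references rather than asserted truths—can be sketched in a few lines. This is purely an illustration: the class and field names are invented, the figures echo the “3.5 mio” example from the chat, and “Source X”/“Source Y” stand in for real references; none of this is Wikidata’s actual data model.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of "referenceable facts": a Statement does not claim
# its value is true, only that the named sources report it.
@dataclass
class Statement:
    subject: str
    prop: str
    value: object
    references: list = field(default_factory=list)

# Two conflicting population figures can coexist, each tied to its source.
claims = [
    Statement("Berlin", "population", 3_500_000, ["Source X"]),
    Statement("Berlin", "population", 3_460_000, ["Source Y"]),
]

# Readers see who says what, rather than a single imposed "truth".
values_by_source = {ref: c.value for c in claims for ref in c.references}
```

In a scheme like this, a contested figure—the population of Israel, say—need not be resolved into one number; each worldview keeps its claim, attributed to its source, which is precisely the reassurance Graham’s critics are offering.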

Thanks for research assistance to Morgan Wehling.

Verifiability and Truth: What John Siracusa Doesn’t Get About Wikipedia

on February 2, 2012 at 6:50 pm

One of my favorite podcasts is Hypercritical, co-hosted by and principally featuring the thoughtful criticisms of John Siracusa, a sometime columnist for Ars Technica and Internet-famous Apple pundit. The show’s tagline calls it: “A weekly talk show ruminating on exactly what is wrong in the world of Apple and related technologies and businesses. Nothing is so perfect that it can’t be complained about.” Last week’s edition—“Marked for Deletion”—was about something far from perfect, but of great interest to this blog: Wikipedia.

If you want to listen for yourself, jump to about 1:11:55 (yes, more than an hour into the show) where Siracusa and co-host Dan Benjamin turn the discussion to Wikipedia. And a warning: this is going to be long. Consider it homage.

♦     ♦     ♦

Promisingly, Siracusa begins by asking his co-host to answer, if he can, “what Wikipedia is”. The answer is pretty good for an outsider: it’s a place for sharing information and collaboratively building a resource for (hopefully) accurate information on almost any topic. In general, this will do. But it’s not quite right, as Siracusa explains by recounting his personal experience of trying, in vain, to defend an article from deletion. With five years to reflect on it, Siracusa describes his efforts as a “prototypical example of someone who does not understand what Wikipedia is, proving that he does not understand what Wikipedia is.”

All of this is a way of getting to Siracusa’s fascination—one might say morbid fascination—with Wikipedia’s policy of “Verifiability”. The first paragraph of the policy says:

Verifiability on Wikipedia is the ability to cite reliable sources that directly support the information in an article. All information in Wikipedia must be verifiable, but because other policies and guidelines also influence content, verifiability does not guarantee inclusion. The threshold for inclusion in Wikipedia is verifiability, not truth—whether readers can check that material in Wikipedia has already been published by a reliable source, not whether editors think unsourced material is true.

Or as Siracusa summarizes it: “Something can be as true as you want it to be, if it is not verifiable, it doesn’t go in.” Well said.

He also discusses the related policy of “No original research”. This includes a good explication of the different types of sources that may or may not be used on Wikipedia: primary sources (original documents and first-hand accounts), secondary sources (news articles interpreting primary sources) and tertiary sources (encyclopedias and academic articles summarizing the former). This is advanced stuff, and for a longtime Wikipedian, it’s no small thrill to hear a smart outsider explain why secondary sources are preferred, and work through the fundamental policies of Wikipedia. Siracusa correctly observes: “Wikipedia is not a place where you write down stuff that you know. … Wikipedia writes about other people writing about things.”

Except here’s the thing: Siracusa understands Wikipedia’s core content policies. He just doesn’t like them.

In his particular example, a former standalone article called FTFF (here’s what it used to look like) didn’t survive the process not because it wasn’t true, but (he says) because it contained material that wasn’t verifiable, and constituted original research. This is partly true, but it owes more to a guideline that got only passing mention on the show (and, frankly, in the deletion debate): “Notability”, and specifically the “General notability guideline”. It’s closely tied in with WP:VERIFY and WP:ORIGINAL, and basically says that a topic must have sufficient coverage in secondary sources to be given its own standalone page. FTFF did not, and the result of the debate was to merge the topic to Finder_(software)#Criticism.

Anyway, this pedantry about WP:NOTE and WP:GNG doesn’t affect Siracusa’s main point: If something is true but unverifiable, he would like to see it included in Wikipedia anyway. Nor does it affect his corollary argument, that Wikipedia’s complex rules discourage many would-be participants.

He’s undoubtedly right about the second point: many people try to get involved with Wikipedia who have no idea what it’s really about, and they tend to have a really bad experience. Wikipedia struggles to explain itself to outsiders, and it probably always will.

As to the former, the problem is that he fails to grapple with the implications of the Wikipedia he describes, and this is disappointing. By privileging “truth” above “verifiability”, one gets the impression he’s describing a Rashomon-like Wikipedia where all possible viewpoints are explored, and somehow eventually Wikipedia just makes the right call. This assumes a lot, not least that contentious topics wouldn’t simply devolve into edit wars of unchecked aggression. In a world where Wikipedia aims for truth but eschews verifiability, there are no footholds upon which to steady an argument. There is no way to know what should be considered credible or otherwise.

At times it actually sounds like he’s advocating something that already exists: reliance on “Consensus” for determining how Wikipedia will address the topics it covers. Wikipedia policies and guidelines don’t cover everything, and this is where consensus steps in, however imperfectly. If you’ve ever wondered why there is sometimes an observable discrepancy in the depth or quality of coverage between topics, consensus is the big reason why, and more so the self-selection that shapes consensus. The current, real-world Wikipedia refers to outside authorities as well as consensus among editors; Siracusa’s Bizarro World Wikipedia would jettison the former and rely solely on the latter.

Meanwhile, Siracusa ascribes Wikipedia’s Byzantine rule structure to Wikipedians’ desire for approval from educators and academics, which he thinks is holding back Wikipedia from what it could become. He repeatedly says “Wikipedia should be something different” and refers to “what’s different about online” but he never gets prescriptive and never actually says why the old methods are outmoded. He does say his Wikipedia would seek to “arrive at truth using every tool necessary” and would, for example, allow original research… but what then is the mechanism for (dare I say) verifying it?

At one point, Siracusa compares the popular, widely-viewed Ars Technica forums to a hypothetical low-circulation print magazine, and complains that the widely-read former site is an invalid source while the unpopular latter publication is acceptable. It’s true that Wikipedia does not necessarily take a populist approach to evaluating sources, but he’s far off the mark in his attempt to explain this: “They’re not cool with the old librarians, because they’re not paper.”

I hope that he was just being lazy and doesn’t actually think that Wikipedia editors prefer paper (if anything they actually prefer online sources, which are easier to check), but he completely misses a key dynamic that ties back to verifiability: the paper magazine with poor circulation will at least have editors who are presumed to care about fact-checking and accuracy. A web forum, however popular it may be, may have moderators, but that’s not the same thing as having an editor. A discussion group is not an editorial operation, period. The forum is a primary source, and so should only be used to support reliable sources.

There are, however, reliable web sources. One of them is the editorial side of Ars Technica; no less an authority than John Siracusa has been cited in approximately 150 different Wikipedia articles about the Macintosh and other technology subjects.

♦     ♦     ♦

I’m sorry to say this, but in the show’s last fifteen minutes, Siracusa pretty much descends into total incoherence. Here’s his summary statement, close to verbatim:

[There are] many flaws in verifiability and reliability of sources. It’s built on a foundation of sand. Notability, what’s a reliable source, those things become so key to making Wikipedia crappy or good, and those sands are constantly always shifting, you know? And so if Wikipedia was centered on truth and that was its final goal, yeah, it would have to include citations and verifiability and stuff like that, but there would never be any argument when the two are in conflict. You know, if you could prove that a series of events happened here, then you could say, well, it’s verifiable, it appeared in a reliable source, but it’s not the truth. And so therefore we should expunge that. Because the final goal of Wikipedia is truth. But the final goal of Wikipedia is not truth, it’s verifiability.

There would “never be any argument” about what is the truth? In the parlance of Wikipedia: [citation needed].

Look, this is an epistemological issue, one much larger than just Wikipedia. The reason Wikipedia’s goal is verifiability, not truth, is because verifiability is an achievable goal. In fact, verifiability is a necessary step toward establishing truth, as Siracusa at this point seems to acknowledge in his imagined alternate, truth-seeking Wikipedia.

It’s not that Wikipedia is actively hostile to the truth: it’s just agnostic as to what it might be. Wikipedia articles are like road signs; truth itself may be unknowable, and we may never arrive at our destination, but Wikipedia can point in the right direction. Wikipedia’s policies and guidelines are designed to make sure that its content does that, although it’s fair to acknowledge that it’s not guaranteed. But what is? And what is truth?

Anyway, there’s a user essay on Wikipedia called “Verifiability, not truth” that says this better than I am going to. Here’s the key point:

That we have rules for the inclusion of material does not mean Wikipedians have no respect for truth and accuracy, just as a court’s reliance on rules of evidence does not mean the court does not respect truth. Wikipedia values accuracy, but it requires verifiability. Unlike some encyclopedias, Wikipedia does not try to impose “the truth” on its readers, and does not ask that they trust something just because they read it in Wikipedia. We empower our readers. We don’t ask for their blind trust.

If you want to upset the old system and do something new, you actually do need to think through what should replace it. Siracusa never does.

If he thinks Wikipedia’s adherence to “old world” rules is driving away contributors, he should consider what the free-for-all alternative would look like. It isn’t a Wikipedia I would spend any time with, it’s not one that Google would be eager to rank so highly, and it wouldn’t be the most important reference site on the Internet.