November 23rd, 2014
(written by lawrence krubner, however indented passages are often quotes). You can contact lawrence at: firstname.lastname@example.org
Back in 1994, L.R. Iannaccone wrote "Why Strict Churches Are Strong," and the gist of the argument was:
The strength of strict churches is neither a historical coincidence nor a statistical artifact. Strictness makes organizations stronger and more attractive because it reduces free riding. It screens out members who lack commitment and stimulates participation among those who remain. Rational choice theory thus explains the success of sects, cults, and conservative denominations without recourse to assumptions of irrationality, abnormality, or misinformation. The theory also predicts differences between strict and lenient groups, distinguishes between effective and counterproductive demands, and demonstrates the need to adapt strict demands in response to social change.
In his introduction, Jemielniak explains he will “try to solve the puzzle of why Wikipedia’s novel organizational design works; it should not, but it does.” One challenge is that—while the number of people involved is small—Wikipedia is far from a tightly organized cabal. In fact, nobody is in charge. (Even co-founder Jimmy “Jimbo” Wales analogizes his own role to that of the British monarch.) Except in the rarest instances, Wikipedia is a self-governing, often fractious, community that relies on finding consensus after extended debate.
Without central direction, Wikipedians have evolved an elaborate system of rules and controls from a few core principles and practices, beginning with what are known as the five pillars, currently summarized on the site as follows:
1. Wikipedia is an encyclopedia.
2. Wikipedia is written from a neutral point of view.
3. Wikipedia is free content that anyone can use, edit, and distribute.
4. Editors should treat each other with respect and civility.
5. Wikipedia has no firm rules.
The neutral point of view rule (NPOV, in Wikipedia lingo) accompanies two other “core content policies”: verifiability (V) and a ban on original research (OR), which together require that “all material in Wikipedia must be attributable to a reliable, published source.” An editor cannot simply add a fact. (In a notorious case, Philip Roth petitioned Wikipedia to correct a description of the origin of his novel The Human Stain, only to be told by a Wikipedia administrator, “I understand your point that the author is the greatest authority on their own work, but we require secondary sources.”)
To work, the rules need both enforcement and interpretation. Someone, somehow, has to exercise authority. Despite what the phrase suggests, a “spontaneous order” that evolves without top-down control doesn’t arise by magic. It is the result of many small, decentralized decisions. In Wikipedia’s case, those begin with its basic activity: writing encyclopedia entries. Editors make changes in articles, add references and images, and alter or reverse (“revert”) bad edits. Some concentrate almost entirely on monitoring vandalism, bad-faith changes, and violations of Wikipedia guidelines. Contrary to widespread belief, the most controversial pages are often the most substantial and balanced, as editors with competing views add information, subject to Wikipedia’s rules about neutrality and sourcing. The George W. Bush article, for example, is remarkably detailed and dispassionate. Blatant inaccuracies and biased statements tend to be swiftly eliminated.
On separate discussion pages, Wikipedians hash out disagreements about article content. Disputes end only when an arrangement acceptable to all participants is reached. (This makes persistence as decisive as persuasive arguments. “Tiring out one’s opponent is a common strategy among experienced Wikipedians,” admits Jemielniak. “I have resorted to it many times.”) Wikipedians also debate organizational issues, such as whether to promote editors who’ve applied to become administrators, the most common managerial role. They consider what new rules to adopt and how, or whether, to sanction individuals who violate policies and norms. These debates and discussions give participants a strong sense of agency and belonging.
All these practices make Wikipedia work as well as it does. As even ardent Wikipedians would admit, though, they come with considerable costs. “[T]he amount of peer control that all Wikipedians experience is extremely high,” writes Jemielniak. “There is a Panopticon-like record of everyone’s actions, and Wikipedia’s control of participation through a high degree of regulation and procedures would be surprising even for a highly bureaucratic organization.”
In theory, anyone can contribute to Wikipedia articles and anyone can propose a new policy or rule. In reality, Wikipedia functions as a largely closed community, using procedural knowledge and a sort of passive-aggressive resistance to deter outsiders. Wikipedia’s professed egalitarianism means that status in the outside world, including deep subject expertise, counts for nothing. But an internal hierarchy still exists. Status comes from mastering Wikipedia’s ever-increasing rules, engaging in community debates, and earning a reputation as hardworking. Racking up sheer numbers of edits matters a great deal, and Jemielniak laments that “a thousand minor corrections help raise organizational standing more than creating a perfect one-thousand-word article does.”
Official policies tell editors to tolerate newcomers’ innocent mistakes (“Please do not bite the newcomers”), but active editors often reverse newbies’ contributions without explanation. “Activists have been at it five and 10 years and don’t tolerate little mistakes,” says Jensen, an editor since 2005. He recalls running a workshop in which a well-known expert on Montana history tried to add a paragraph to the site, only to see it immediately erased.
Editors distrust newcomers for a reason: bitter experience. “Trolls come,” Jemielniak tells me in an interview. “If you spend time reviewing recent changes, after an hour or two you will have a feeling that the world is composed mostly of primary school students and cranks.” Some vandals simply replace an article’s text with random characters: destruction for its own sake. Instead of improving article content, editing often means acting as a human spam filter. Jemielniak and others may decry Wikipedians’ emphasis on edit numbers, but valuing lots of small changes, even out of testosterone-fueled competitiveness, has an unsung benefit: It encourages editors to discover and repair damage. Eternal vigilance keeps the site’s contents from decaying.
But the us-against-them attitude threatens Wikipedia’s future, as existing editors drift away and aren’t replaced. Consider the career of Adrianne Wadewitz, a literary scholar and much-admired Wikipedia editor who died last April from rock-climbing injuries. She began her editing in 2004 and ultimately contributed more than 49,000 edits to the site, including the creation of many entries on female authors and women’s history. But her newbie contributions would never survive today. With their declarations that Samuel Richardson’s Pamela “began an eighteenth-century tradition of epistolary novels” and that “biting social commentary and masterful use of both indirect speech and irony eventually made [Jane] Austen one of the most influential and revered novelists of the early nineteenth century,” Wadewitz’s authoritative summaries of scholarly opinion defied the prohibition on original research and lacked footnotes to verified sources. Back then, she could get away with it, and these early edits laid the foundation for articles that are now well footnoted. But on today’s more mature and wary Wikipedia, such good-faith but rule-breaking contributions tend to get immediately reverted rather than gradually improved.
Wadewitz was exactly the sort of knowledgeable enthusiast the site needs. But those who would take her place are likely to be frightened off. A study led by Aaron Halfaker, a Wikimedia Foundation research scientist, found that in 2011 less than 10 percent of “desirable newcomers” making good-faith changes continued editing for at least two months, compared to more than 25 percent in the first half of 2006, when the community of editors was still growing.
None of this would matter if Wikipedia were a finished product. But it isn’t. As time goes on, there’s always more to add, and many articles still fall short of encyclopedia quality. A good Wikipedia page adheres to the rules about neutrality and sourcing, but it’s likely to offer a patchwork of facts without the context explaining why these particular facts and not others were included. That’s why, for instance, the historian Malcolm Rohrbough, author of Days of Gold: The California Gold Rush and the American Nation, describes the article on the California Gold Rush as “a collection of miscellaneous information, most of it accurate.” Certainly, it’s possible to write articles that offer context while following the rules, but it requires editors with both writing skills and a deep knowledge of potential sources.
An online encyclopedia also faces challenges its print equivalents never did. How, Jemielniak wonders, will Wikipedia adapt its contents to Google Glass displays? How will its multilingual sites employ improved automated translation? Meeting an unknown future requires flexibility, and with its bureaucracy, jargon, and endless arguments, Wikipedia hardly looks sufficiently nimble.
Of course, betting against Wikipedia has long been an empirically foolish proposition. “The thesis that the community will blow up itself has clearly proven wrong over the last 11, 12 years,” Jemielniak says. But the perils of adolescence were one thing. What about middle age?