
Wikipedia wars: inside the fight against far-right editors, vandals and sock puppets

Like Facebook, Google and Twitter, Wikipedia has become a fixture of online life.

With more than five million articles, it is the world's go-to source for all kinds of information. However, the free encyclopedia's openness and anonymity leave it vulnerable to manipulation by neo-Nazis, white nationalists and racist academics seeking a wider audience for extreme views.

The far right has been active on Wikipedia since it first went online in 2001, but in the past two years, its presence has grown with the emergence of the alt-right and the surge in rightwing populism in Europe and North America, says administrator Doug Weller.

Throughout the years, the site has refined its processes and policies for handling disruption and fringe points of view, but as Wikipedia gets bigger, so does the challenge of ensuring its integrity and neutrality. Weller says it's an issue of manpower. Though Wikipedia has more than 32 million registered users, only around 130,000 have used the site in the past month. The task of guarding against vandalism and bias generally falls to a much smaller core of veteran editors and admins. The sheer volume of activity (10 edits per second and 600 new pages created daily) can easily outstrip Wikipedia's capacity to police its content.

There鈥檚 an ever-present threat that an organized faction or a group of single-purpose editors working in concert can exploit Wikipedia鈥檚 mechanisms to tilt its point of view in favor of a fringe perspective.

Civil POV

The far right's activity on Wikipedia is relatively easy to manage when it clearly violates one of the site's policies, such as vandalism, harassment, ad hominem attacks or edit-warring, which refers to dueling edits by one or more factions outside of the normal dialogue mechanisms. But it's much harder to keep editors in check when they dance on the line between acceptable and unacceptable behavior. This is known as "civil POV" pushing: an editor or a group of editors tries to tilt an article toward a particular point of view but remains polite and abides by site-wide norms of behavior.

Wikipedia's policies are more oriented toward conduct than content, said Magnus Hansen, a postdoctoral fellow at the University of Copenhagen who has been editing Wikipedia for more than 10 years.

"That means that it is hard to get users blocked or restricted for consistently providing ideologically skewed content, unless it can be demonstrated that they are deliberately breaking the community's rules for conduct or content creation," Hansen said, citing the case of one user who repeatedly made antisemitic pages and, when called out on it, claimed he was acting in good faith.

One of the bedrock principles of Wikipedia is the assumption of good faith. Unless there is evidence to the contrary, edits are assumed to be made with the intent of improving the encyclopedia. Editors who attempt to insert their ideological bias but maintain the semblance of civility are given the benefit of the doubt until their disruption becomes apparent enough to warrant action by administrators. Civil POV-pushers can disrupt the editing process by engaging other users in tedious and frustrating debates or by tying up administrators in endless rounds of mediation.

Users who fall into this category include racialist academics and members of the human biodiversity, or HBD, blogging community. Often these are single-purpose accounts that edit exclusively on topics like race and intelligence, racial classification and biographies of related researchers, like Linda Gottfredson or Helmuth Nyborg. Some have direct ties to racist journals or organizations like Mankind Quarterly. Emil Kirkegaard, who edits frequently, is a research fellow at Richard Lynn's Ulster Institute for Social Research and the co-founder of an online pseudojournal.

These users tend to maintain a moderate, non-confrontational tone and adopt a posture of academic neutrality, so they are less likely to run afoul of site-wide rules and more likely to make edits that stand.

One way a civil POV-pusher can nudge the narrative of a page while still complying with the rules is by adding information that is reliably sourced and factually accurate but nevertheless misleading. The "race and intelligence" page describes William Shockley as a "Nobel laureate," though it neglects to mention that Shockley received the prize in a field irrelevant to the topic or that he held extreme racist views. The biography of Arthur Jensen misrepresents the impact and credibility of the controversial psychologist by enthusiastically billing one of his papers as "one of – if not the most – cited papers in the history of psychological testing and intelligence research," while only a footnote clarifies that many of the citations were works refuting his ideas or using the paper as an example of the controversy.

Editors can also abuse Wikipedia's guidelines and processes. For example, the restrictions on biographies of living persons, or BLP, were used to block academic criticism of Jensen and fellow racialist academic J. Philippe Rushton when they were still alive.

Neutral POV, or NPOV, is another policy that requires nuance and is subject to abuse. It's often difficult to tell if users who cite this rule are acting in good faith. Appeals to neutrality are sometimes used to soften language in articles and cast doubt on generally accepted facts. In the past, neo-Nazi users have challenged the use of the words "slaughter" and "murder" with reference to the Holocaust on these grounds.

Sources

An ideal Wikipedia article should be a neutral presentation of all the notable perspectives on a particular topic drawn from reliable sources. Like everything else on Wikipedia, however, there's room for debate on what meets the standard.

Hansen explains: "Getting consensus about what sources are reliable is very hard, since the racialists have their own venues for publication – primarily Mankind Quarterly but also Intelligence, and Personality and Individual Differences – and because one has to make an assessment of every proposed source individually in a given context and among a given group of editors. No decisions are binding across contexts or across different articles or groups of editors."

Wikipedia doesn't have a master list of unreliable sources, but it does maintain a noticeboard for fringe sources and theories, and the discussions there override other localized debates, Weller said.

Though some sources, like the white nationalist American Renaissance website, are easily identified as fringe, others, like the journal Intelligence, sit on the borderline. Intelligence, a peer-reviewed academic journal, is ranked 10th among journals dealing specifically with the topic of psychometrics. Based on these attributes alone, it meets Wikipedia's criteria for a reliable source.

However, as Hansen pointed out, the journal often serves as a platform for questionable research. Intelligence is edited by Richard Haier, who is sympathetic to the hereditarian point of view. Haier was one of the signatories of Gottfredson's op-ed, and he recently penned an article in Quillette defending Charles Murray.

One example illustrates just how complicated the process of establishing reliability can be and how sources with dubious credibility can still sometimes find their way onto Wikipedia. Not long ago, a reference to a paper by a man named Davide Piffer was added to the article "history of the race and intelligence controversy." While it initially failed to pass peer review, Piffer's paper on the frequency of a set of intelligence-linked genes in different populations was ultimately published in Intelligence in 2015. It was subsequently hailed on American Renaissance and a variety of HBD sites as solid evidence for racial differences in intelligence.

Though the paper was published in a source considered reliable by Wikipedia, Piffer's credentials, affiliations and the scientific merit of the paper itself are suspect. Piffer is an associate of the Ulster Institute for Social Research. In an interview with American Renaissance, Lynn, the head of the institute, referred to him as one of the "rising stars" of intelligence research, noting that "he is from the north of Italy where the more intelligent Italians are found."

At the time he published the paper, his highest level of education was a master's in evolutionary anthropology, and his list of research interests includes parapsychology. Aside from his credentials, Piffer's other work on fringe topics, including a form of extrasensory perception widely dismissed as pseudoscience, also raises eyebrows.

Piffer's lack of profile poses a dilemma, too. Because it appeared in an academic journal, his paper was deemed fit for inclusion on Wikipedia; at the same time, it's not noteworthy enough to justify published responses from other academics, so there are no sources to supply criticism or contextualization. Ironically, one of the few credible academic responses would likely not meet Wikipedia's standards, since it was published on the Unz Review, a platform for fringe writers.

In addition to issues of reliability, there is also the question of weight. Wikipedia's guidelines state that academic sources should be given weight in proportion to their credibility and currency within established scholarship. This is meant to prevent a false balance in the application of NPOV guidelines, but because these determinations are made by the community as a whole, a false balance can emerge anyway.

In the article on "race and intelligence," relatively equal weight is given to the two sides of the debate – hereditarian and environmentalist – though environmentalism is the mainstream perspective in psychology.

Researchers associated with the Pioneer Fund, an organization with eugenicist roots that is the primary benefactor of racialist research, are mentioned slightly more often than mainstream researchers. For example, Nicholas Mackintosh, author of a widely used graduate-level textbook on intelligence, is mentioned 30 times, while Rushton, the long-time president of the Pioneer Fund, is mentioned 38 times. Similarly, Jensen, who received more than $1 million from the Pioneer Fund in his lifetime, racked up 61 mentions versus 51 for renowned psychologist James Flynn.

Sock puppeting

Another common problem for Wikipedia is abuse of multiple accounts, or sock puppeting. Multiple accounts can be used to skirt bans and other administrative actions or to stage small dramas inside the discussion sections of articles. Some sock puppets are employed to create the impression that a point of view has wider support, while others serve as a straw man for the puppeteer's argument. Editors can play both sides to get a debate started, or "prime the pump." Others use separate accounts to pull a Jekyll-and-Hyde routine, adopting one persona for disruption while reserving a "clean" account for civil editing.

One of the white nationalists who co-founded Rightpedia, a far-right free encyclopedia that split from Metapedia, created more than 140 accounts in the past 10 years. Administrators have gotten better at identifying sock puppets, but it's still a frustrating, time-consuming process. When a sock puppet is suspected, it first has to be reported to a "checkuser," a special type of admin who has privileges to see a user's IP address, which is then cross-referenced against the IPs of known users.

However, there are ways to avoid detection, such as using public computers, Wi-Fi hotspots, or proxies and VPNs, so it's often hard to block a sock puppet on the basis of an IP alone. Admins have to rely on other evidence, like patterns of behavior and verbal clues. Wikipedia is working on automated solutions that use machine learning to identify certain commonly used phrases, but for now, the front line of defense continues to be human editors.
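The phrase-based detection described above can be illustrated with a deliberately simplified sketch. This is not Wikipedia's actual tooling (its real systems are far more sophisticated); the function names and sample comments below are hypothetical, and the sketch only shows the underlying idea: two accounts that reuse the same distinctive wording score as more similar than unrelated editors.

```python
def ngrams(text, n=3):
    """Return the set of word n-grams found in a piece of text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def phrase_similarity(comments_a, comments_b, n=3):
    """Jaccard similarity between the n-gram sets of two accounts'
    combined edit comments: 0.0 means nothing shared, 1.0 identical."""
    a = ngrams(" ".join(comments_a), n)
    b = ngrams(" ".join(comments_b), n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Hypothetical edit summaries from two accounts suspected of being
# the same person; a high score would queue them for human review.
suspect_score = phrase_similarity(
    ["rv cultural marxist POV pushing", "rv cultural marxist POV again"],
    ["undo cultural marxist POV pushing as usual"],
)
```

In practice a checkuser would weigh a signal like this alongside IP evidence and behavioral patterns rather than acting on wording overlap alone.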

Canvassing/meat puppeting

In addition to creating multiple accounts, Wikipedians can tilt the editorial process in their favor by recruiting likeminded individuals on Wikipedia and off-site forums. They can also call upon people they know offline. For instance, during the arbitration on "race and intelligence," it was discovered that two accounts had been created within 24 hours of each other on the same computer and that the users, who had been tag-team editing, were in a romantic relationship. The couple had also enlisted the help of another user they knew from the website DeviantArt.

In recent years, the proliferation of far-right online spaces, such as white nationalist forums, alt-right boards and HBD blogs, has created a readymade pool of users who can be recruited to edit Wikipedia en masse. Before he was banned from the site, the aforementioned prolific sock puppeteer posted a message on Stormfront decrying the domination of Wikipedia by "Jews" and "cultural Marxists" and calling on others to help him. A quick search of the forum reveals several threads recruiting members to edit. The Wikipedia entry dedicated to Stormfront has been plagued by constant disruption initiated by one forum user who called for help "keeping an eye on the page."

The alt-right /pol/ board on 4chan has also acted as a platform for launching attacks on Wikipedia. Not long ago, members of the board vandalized the entry for Buzzfeed by adding crude jokes and changing the name of its owner to Donald Trump. The r/The_Donald subreddit, which has significant user overlap with /pol/, has dozens of threads directing members to make changes to various Wikipedia entries, ranging from a request to "unc---" the entry on the subreddit to a call for someone to fix Sean Hannity's page. Most recently, there was a thread calling attention to Wikipedia's labeling of Alex Jones' Infowars as a "fake news website."

While many of these efforts might only result in petty vandalism and trolling, some of the threads hint at a more sophisticated attempt to influence the content of Wikipedia. Posts on both Stormfront and The_Donald include discussions about how to operate within the boundaries of Wikipedia鈥檚 policies, which could lead to activity on the site that is much more difficult to deal with.

POV forks

"Watchlists" are one of the main safeguards against fringe perspectives on Wikipedia. Editors can receive notifications when changes are made to a page and respond quickly to fishy edits. But when editors find their point of view blocked on a controversial article, they sometimes bypass these mechanisms by starting a new page dedicated to that position, called a POV fork.

While new pages are patrolled for obvious violations of Wikipedia policy, a POV fork could deal with an esoteric topic, so the volunteer patroller might not have sufficient specialized knowledge to identify problems with it. Such is the case with fringe racialist theories.

Last November, Kirkegaard created a new page on "Cold winters theory," a fringe "race realist" theory put forward by Lynn and Rushton to offer a pseudoscientific evolutionary explanation for alleged race differences in intelligence. In this case, Wikipedia's mechanisms acted relatively quickly: following a short discussion, the article was recommended for deletion and redirected to Lynn's page in less than a month. In other cases, the process can take much longer. Wikipedians spent three months debating another article, which so closely resembled neo-Nazi propaganda that it was eventually removed.

Similar forks have escaped detection. In September 2016, an entry on "Differential K theory," another of Rushton's core ideas, was added by a different user. The theory, an essential component of the "race realist" canon, is extensively covered on far-right alternative encyclopedias like Metapedia and Rightpedia, as well as on a number of HBD blogs and YouTube channels. The article has remained up for over a year now. Though it contains one or two lines of criticism, most of the academic responses included are from other researchers affiliated with the Ulster Institute or the Pioneer Fund, like Lynn and Michael A. Woodley.

If there are enough different perspectives on a topic, a POV fork might avoid deletion altogether, but it can still be problematic, especially if research on a niche topic is dominated by a fringe group of scholars. The article "nations and intelligence" gives disproportionate weight to the views of Lynn and his co-author Tatu Vanhanen because they are among the few scholars to research the subject. Much has been done to balance the article over the years, but for much of its history it has served to promote and legitimize Lynn and Vanhanen's thesis, and at one point it hosted charts of national IQ based on their data as well as an IQ map that has become an often-shared right-wing meme.

Soft Targets

Wikipedia's watchlist system is only as good as the editors watching a given page, and it functions best with high participation. A page with few watchers is vulnerable to manipulation by a self-selected group.

The "race and intelligence" page has more than 700 watchers, while the page on the Afrikaner Weerstandsbeweging (AWB), a neo-Nazi South African separatist group, has fewer than 90. The edit history of the AWB page reveals a long pattern of edit-warring over the past year contesting the characterization of the group as "neo-Nazi white supremacists" or accurate descriptions of the group's use of Nazi imagery. Neutrality flags have been quickly removed.

The page's bias would hardly be obvious to a reader who is not familiar with the AWB's history, but it omits virtually all references to the group's violence. Of the AWB's activity in the run-up to multiracial elections, the page says only "During bilateral negotiations to end apartheid in the early 1990s, the organization received much publicity" and, elsewhere, that it "threatened all-out war." But it did much more than threaten.

The group unleashed a wave of violence aimed at derailing the transition to majority rule. In addition to bombing schools, ANC facilities and black taxi ranks, the AWB enforced "white by night" racial curfews with lethal force and at makeshift roadblocks. While none of the bloodiest chapters from the group's history made it onto its Wikipedia page, there is an entire section devoted to the AWB's charity initiative known as Volkshulpskema, which is sourced entirely to the blog of South African neo-Nazi Arthur Kemp.

Interestingly, Kemp was cited not by any far-right extremist but by a veteran editor named Zaian, who has thousands of contributions to various articles on South Africa. In 2007, an anonymous editor added a section titled "Volkshulpskema and Rise to Power" without references, which contained favorable comparisons to the social welfare aspects of Nazi Germany.

Apparently, Zaian was doing some housekeeping on the page; seeing a section without references, he added the only source he could find: Kemp's book. On another talk page, Zaian had called out the same anonymous editor for making bizarre statements about race mixing, yet, through inattention to detail, Zaian ultimately validated the contributions of a person he had described as an "extremist" and ensured those changes would still be around more than 10 years later.

What can be done?

The presence of white nationalists and other far-right extremists on Wikipedia is an ongoing problem that is unlikely to go away in the near future, given the rightward political shift in countries where the majority of the site's users live. Wikipedia's leadership is acutely aware of the issue, but it faces the difficult task of balancing its mission of inclusiveness and tolerance of a diversity of viewpoints with the aim of maintaining the integrity, accuracy and neutrality of its content.

When asked how Wikipedia can maintain this balance, Hansen responded: "It cannot. Or at least that it can only do so to the extent that there are sufficient people who are sufficiently well versed in a specific topic and who have sufficient time and patience to keep defending the representation of it in long and tedious discussions with people who know less and use academically absurd standards of evidence." Unfortunately for Wikipedia editors, Hansen said, those people "are often more numerous and more persistent."
