

This past March, researchers from the Anti-Defamation League accused Wikipedia of biased coverage of the Israeli-Palestinian conflict. They found that a group of editors had coordinated to introduce anti-Israel bias into thousands of entries, and that the platform’s administration had failed to respond. This is not the first example of interested parties abusing Wikipedia’s editorial policies. Wiki sections in multiple languages have been subjected to organized “raids” carried out by state propagandists, far-right activists, and even supporters of terrorism. Some critics believe that the Web’s main encyclopedia needs reforms, such as the introduction of user verification. Others doubt such measures will help, questioning the viability of collective content moderation in the digital age. Meanwhile, Wikipedia is gaining unprecedented influence as a training dataset for major AI language models, and distortions in its content have already affected the answers the resulting chat tools give.
Contents
Organized editorial teams
The Battle of the Russian Wiki
Linguistic autonomy
State intervention
Defenseless mechanisms
Less freedom, more responsibility
A group of pro-Palestinian editors has been coordinating changes to English-language Wikipedia entries for at least a decade, according to a report released by the U.S.-based Anti-Defamation League (ADL) on Mar. 18. Over the course of ten years, the same 30 people made more than 1.6 million edits to articles related to Israel and Palestine. The ADL analyzed these users’ activity against several control groups: the 30 most active editors of the page covering the war between Israel and Hamas, editors working on the topic of U.S.-China relations, and 30 users chosen at random from among English-language Wikipedia’s 5,000 most active editors.
30 people have made more than 1.6 million edits to articles related to Israel and Palestine in 10 years
It turned out that the “bad-faith” editors were significantly more active than those of other groups, and their activity spiked after Hamas attacked Israel on Oct. 7, 2023. Whereas the group had previously averaged 25 edits per day (already enough for a full work shift), after the outbreak of the war their workload was ramped up to 45 daily edits.
In tens of thousands of cases, different authors from this group edited the same entries simultaneously, indicating a concerted effort. Wikipedia has a self-regulatory mechanism in which users vote to remove or keep controversial pages. Voting on Arab-Israeli topics also revealed organized participation by “bad-faith” editors.
Organized editorial teams
The ADL report is not the first example of an “Arab-Israeli conflict” being fought out on Wikipedia. In October 2024, Ashley Rindsberg, editor-in-chief of Pirate Wires, published a similar investigation that identified 40 editors who'd joined forces to rewrite numerous English-language entries. Notably, they had removed from the Hamas page a paragraph of the group’s charter calling for the destruction of Israel and the murder of Jews.

Ashley Rindsberg
Edits are not limited to contemporary events. Pro-Palestinian editors have tweaked articles about the ancient history of the Jewish people and removed references to the Palestinian Mufti of Jerusalem collaborating with Hitler. Some articles on Israeli topics had up to 90 percent of their content edited, Rindsberg found.
The same users have tried to remove information about human rights abuses in Iran or else merely downplay the scale of repression there, Dr. Shlomit Aharoni Lir, a researcher at Israel's Bar-Ilan University, tells The Insider. Until Oct. 7, she was primarily focused on Wikipedia's gender bias and the lack of women's visibility on the platform. But she could not overlook the new phenomenon:
“These people work shifts, eight to nine hours a day. One editor went through all the articles mentioning Israel’s War of Independence and changed the title to ‘The Palestinian War of 1948.’”
In January 2025, the Wikimedia Foundation, which formally runs the free encyclopedia, took limited action, blocking six pro-Palestinian editors mentioned in the Rindsberg investigation, along with two pro-Israel users. But the situation hasn’t changed, Aharoni Lir says: “Every edit I made on conflict-related topics was reverted almost immediately. For instance, authenticated images for the article ‘Use of human shields by Hamas,’ which were accepted in other language sections of Wikipedia, were deleted from the English Wikipedia without any sound justification.”
Wikipedia's protection mechanism was weak from the very beginning, says Dmitry Khomak, creator of the Russian Internet culture encyclopedia Lurkmore: “The fundamental problem with Wikipedia is that it is very difficult to defend against the actions of an externally coordinated team.”
The Battle of the Russian Wiki
Had it not been for a happy coincidence, the article on the Russian invasion of Ukraine in the Russian-language Wiki might well have been titled “Special Military Operation in Ukraine.” The Wiki system provides for multi-stage indirect voting, but ultimately the agenda is set by the majority. When the full-scale war in Ukraine began, its opponents enjoyed a numerical advantage, itself the result of the exposure of a group of pro-Russian editors in 2021, says longtime Wiki editor Andrei (whose name has been changed): “Since 2014 or so, the Russian Wikipedia has been dominated by people we would now call ‘zetniks’ [Putin regime loyalists]. Pro-Ukrainian authors were banned and thrown out, and the arbitration committee did nothing,” Andrei recalls, referring to the group of experienced users tasked with resolving disputes on the platform.
Since 2014, the Russian Wikipedia has been dominated by Putin regime loyalists
It was later revealed that members of the same pressure group were also members of the Russian-language Wikipedia arbitration panel. In 2021, the Wikimedia Foundation banned the leader of this group, an editor with the username A.Vajrapani.
Later that year, the foundation also foiled an attempt to interfere in elections to the Russian-language Wikipedia arbitration committee. The candidates shared roughly the same views as A.Vajrapani and enjoyed backing from digital entrepreneur and Russian nationalist Sergei Nesterovich.
Wikipedia editor Andrei is sure that the campaign was an attempt by Kremlin loyalists to take over the Russian Wiki:
“Nesterovich tried to manipulate the arbitration committee elections. But he showed his hand by bragging about his influence to the press. When this came to light, the whole group was removed.”
Another organized attempt to doctor conflict-related content on the Russian-language Wikipedia is the case of the so-called “Azerbaijani mailing list.” In 2009, 24 users — later nicknamed “Baku commissars” in internal correspondence — decided to coordinate edits to articles on the Armenian-Azerbaijani conflict. The mailing list consisted of combatants from the Azerbaijani side. According to Andrei:
“Some of them were even associated with the Azerbaijani Academy of Sciences and provided sources for edits. Eventually, an Armenian infiltrated the group, published the mailing data, and brought it before the arbitration committee.”
Azerbaijani influence lasted only a year, but the pro-Russian editors who took over controversial themes in the Russian Wiki after the annexation of Crimea held sway for nearly eight years. In Andrei's opinion, that was possible because no neutral majority capable of settling differences around the topic of the Russian-Ukrainian conflict existed.
Pro-Russian users who hijacked political topics on the Russian Wiki after the annexation of Crimea held sway for nearly eight years
“Normally, any dispute among editors involves a few, at most a few dozen people. There is a large neutral mass to help resolve the conflict. But after 2014, the neutral core just wasn't there. As a result, no decisions could be made without an investigation or scandal,” Andrei says. “I think the situation is similar among the editors of the English-language Wiki on the issue of Israel's war with Hamas.”
Linguistic autonomy
In the context of the Gaza war, the case of the Arab Wiki stands out. In the fall of 2023, the section placed a Palestinian flag on its home page and kept it there until the end of 2024.
Articles about the parking lot bombing at Al-Ahli Hospital in Gaza vary greatly depending on the language. The Russian and English pages both say that the explosion occurred in the parking lot of the medical facility, though they cite different casualty tolls. Both mention the Palestinian version, which blames an IDF strike on the hospital, and analyze Israel’s argument that a rocket fired by a local armed group landed in the parking lot. In Arabic, however, the article is titled “Baptist Hospital Massacre.” It places the blame unequivocally on Israel, and the higher casualty toll it cites is backed only by Hamas-controlled sources.
The Arabic Wikipedia article on the bombing outside Al-Ahli Hospital in Gaza is titled “Baptist Hospital Massacre”
Wikipedia has already witnessed instances of biased groups taking over the management of individual language communities. The Wikimedia Foundation intervened on at least two occasions.
One prime example is the Chechen-language Wiki, which from its creation was controlled by political émigrés who support the recognition of an independent Ichkeria. Its title page was decorated with the Ichkerian flag. Articles about fighters who used terrorist methods in their struggle for independence were laudatory, while those about the current Chechen leader — Putin ally Ramzan Kadyrov — contained insults.
The conflict between foreign-based and Russia-based editors of the Chechen Wiki involved threats of physical violence, and in 2013, the Wikimedia Foundation stepped in. The previous administrators were removed and new ones were appointed. Now the articles are short and neutral in content. For instance, the article on the First Chechen War (1994-1996) is limited to a brief chronology of events.
Another case of external intervention involves the Croatian Wiki, which in the 2010s fell into the hands of the country’s far right. These editors made homophobic edits, praised the Ustasha regime, and justified the Holocaust. In 2013, the Croatian minister of education went so far as to urge school and university students not to use Wikipedia. In 2021, the Wikimedia Foundation launched an investigation and revoked the credentials of several administrators.
In 2013, the Croatian minister of education urged school and university students not to use Wikipedia due to far-right manipulation of the platform
While the Chechen and Croatian Wikipedias are relatively small, another case of external interference concerns a much larger language section: the Chinese Wiki. In September 2021, the Foundation blocked seven active pro-government editors and revoked the administrative powers of 12 others. They were accused not only of coordinated edits but also of intimidating and harassing democratically inclined users.
State intervention
Governments have three main levers for controlling Wikipedia, explains editor Andrei: they can restrict access to the site, hire editors with “correct” views, and repress those in opposition. China has tried all three approaches, alternately blocking and unblocking Wikipedia several times.
In the early days of the Chinese Wiki, users loyal to Beijing tried to edit articles on painful topics. Predictably, the main battles unfolded on the page of the 1989 Tiananmen Square massacre. The first version of the article, published by an anonymous user, consisted of only three phrases. It claimed that government troops had seized control of Tiananmen Square after it became “a camp for a variety of hostile forces,” The Washington Post reported.
Some editors tried to add information about the slain protesters, but others deleted those edits. In June 2004, the 15th anniversary of Tiananmen, China blocked Wikipedia for the first time.
After a while, the ban was lifted, then reintroduced. The government also tried restricting access to individual pages before finally banning Wikipedia in its entirety in 2019. On Oct. 24, 2020, police in Zhoushan arrested a local resident for trying to gain “illegal access” to the encyclopedia.
In 2020, a resident of China was arrested for attempting to gain “illegal access” to blocked Wikipedia
Other non-democratic regimes have also been accused of trying to influence the free encyclopedia. In 2019, Open Democracy exposed the influence of Iran’s Ministry of Culture and Islamic Guidance on the Persian-language Wikipedia. Formally, the portal is managed by a non-governmental organization, which was offered an office in the ministry’s building.
Unsurprisingly, the content of the Persian Wiki articles is aligned with Tehran's official narrative. In 2024, The Times detailed how Iranian admins edited and redacted references to the repression carried out by the Ayatollahs' regime. Allies of Iranian activist Vahid Beheshti told reporters that they had tried to create a page about him multiple times, but the text was always deleted.
In 2020, Saudi Arabia sentenced two Wiki editors to prison terms, The Guardian wrote, citing local human rights organizations. One of the convicts, Osama Khalid, received 32 years in prison, while Ziyad al-Sofiani received eight.

Wikipedia editors Osama Khalid and Ziyad al-Sofiani convicted in Saudi Arabia
In 2022, the Wikimedia Foundation blocked 16 users who were found to have made edits to the benefit of the Saudi government. Most notably, these concerned the article about the 2018 murder of journalist Jamal Khashoggi at the Saudi consulate in Istanbul.
Belarus has also initiated criminal cases against Wikipedia editors. On Apr. 7, 2022, a court in Brest sentenced human rights defender Pavel Pernikau to two years in a general-security prison for “discrediting the Republic of Belarus.” As evidence, the court cited Pernikau's edits in two articles in Russian and one in Belarusian. The pages were devoted to protesters killed by the regime in 2020-2021, along with the issue of government censorship.
In June 2022, a court in Minsk sentenced Mark Bernstein, a prominent editor of the Russian-language Wikipedia, to three years of restricted freedom. He was charged with “disorderly conduct” in the form of publishing articles in the online encyclopedia.
Defenseless mechanisms
Wikimedia has taken some steps to combat malicious editing. After Oct. 7, 2023, a new rule was introduced: only users who had made at least 500 edits were allowed to edit English-language articles on sensitive topics.
The Anti-Defamation League report cites the case of a pro-Palestinian editor who found a workaround. He selected several hundred articles mentioning LGBTQ and replaced the acronym with “LGBTQ+.”
A biased Wikipedia editor replaced the acronym LGBTQ with LGBTQ+ hundreds of times to gain the right to edit English-language pages on Israel and Palestine
Meanwhile, such barriers deter the independent editors who work for free. Dmitry Khomak cites a similar effect on Stack Overflow, a Q&A service for software developers: “When [the service] tightened its moderation rules to improve the quality and uniqueness of questions, it triggered an exodus of users. Further tightening of the requirements for Wikipedia editors will produce the same result.”
Another important mechanism for combating propaganda is controlling the sources that can be cited in articles. The Russian Wiki considers academic publications to be the best sources, followed by credible media outlets. This list includes, for example, The Washington Post, The New York Times, and Reuters.
However, the pool of acceptable sources is also determined by a collective vote of the editors. As a result, the Persian-language Wikipedia makes heavy use of references to Iranian state media. In the context of the Israeli-Palestinian conflict, English-speaking members of the Wiki community voted to deem the ADL an unreliable source. Meanwhile, they refused to exclude the Qatari outlet Al Jazeera, which Israel considers a Hamas propaganda mouthpiece.
Wikipedia's pool of credible sources is determined by a collective vote of the editors, so the Iranian Wiki actively quotes state media
The imbalance between references to “right-wing” and “left-wing” publications on Wikipedia is currently the subject of heated debate in the American press. Conservative media outlets are far less likely to be recognized as credible sources, notes The New York Post. Not surprisingly, the Republican-aligned Post is itself among the publications that Wikipedia editors consider to be of dubious credibility.
In a recent study, New Zealand professor David Rozado found that the English-language Wikipedia describes Republican politicians and journalists in a markedly less positive emotional tone than their Democratic counterparts.
Rozado found that the same distortions surface in ChatGPT as well, a consequence of Wikipedia serving as a major text base for AI training. As a result, disputes over the accuracy of Wikipedia articles affect even those who never use the encyclopedia.
Wikipedia offers a large database of texts for AI training, so changes to its articles affect chatbot replies
Another important factor behind the free encyclopedia’s influence is that its articles routinely appear at or near the top of search engine results. That is why the ADL warns AI companies against using Wikipedia to train large language models and suggests that search engines stop displaying Wiki articles at the top of their results. In addition, the ADL proposes that some language versions display translations of the English-language articles instead of the originals.
These ideas are especially relevant for articles on controversial topics, according to ADL, at least until Wikimedia changes its policy.
Less freedom, more responsibility
The Anti-Defamation League report concludes with specific recommendations that Wikimedia reconsider its approach to sensitive topics. These include creating a pool of experts on Israel and the Israeli-Palestinian conflict, verified by the foundation, who would moderate the disputes that arise.
The ADL points to the precedent of the COVID-19 pandemic, when edits to pages on controversial topics were scrutinized by a team of medical experts: decisions were made by a narrow circle of specialists rather than by a simple-majority vote.
Wikipedia editors are skeptical of these proposals. The ADL is asking too much of Wikimedia, editor Andrei believes:
“I find this solution rather odd. The Foundation's interference in the administration of language sections is very rare and hardly ever concerns the content of the articles. It just provides servers, domains, and infrastructure.”
Indeed, Wikimedia does not usually take responsibility for resolving conflicts. Although it has the power to completely replace the pool of administrators of an entire language section, as it did in the Croatian, Chechen, and Chinese cases, the Foundation positions itself as a mere platform for a community of authors. It rarely issues statements to the press and typically responds to defamation suits by pointing to the sources its articles cite. In the U.S., such lawsuits against Wikipedia almost always fall apart.
In the U.S., defamation lawsuits against Wikipedia almost always fall apart
However, a lawsuit that is unfolding in India could set a precedent regarding Wikimedia's legal liability. The foundation is being sued by the local news agency Asian News International (ANI) over an article claiming ANI spreads state propaganda and fake news.
Wikimedia first responded that the foundation “does not add or correct content” and that editorial decisions are made by a “global community of users.” However, the court found Wikimedia itself to be the proper defendant.
The case is being heard in Delhi, and Wikipedians have created a page about the trial. This is a common practice on the platform, but the Delhi High Court considered it an attempt to influence the proceedings and ordered the page to be removed. Wikimedia is now challenging this decision in India's Supreme Court.
What matters in the case is not the history of one particular news outlet, but the fact that Wikimedia has been compelled to answer the claims before a judge, says Dr. Aharoni Lir. She notes another crucial point: at the court’s request, the Foundation disclosed the details of the users who had edited the article.
In India, the Wikimedia Foundation has disclosed user data at the request of the court
According to Dr. Aharoni Lir, Israel’s Justice Ministry is now also looking into the possibility of suing Wikipedia. To protect users against manipulative articles, the researcher suggests that, for a certain range of topics, potential editors be required to prove their identity. As Dr. Aharoni Lir points out, open verification carries the potential for legal liability and therefore deters those looking to spread biased content. “We know that anonymous users are far less restrained than non-anonymous ones,” she says.
However, Dmitry Khomak believes that none of the proposed options will work, simply because intentional manipulators will still be able to fool any potential moderator:
“Wikipedia’s traditional self-regulation model stops working when authoritarian regimes hire people to purposefully edit articles. It is important to observe the formalities: edits must be made consistently, by different participants. Each should insist on their position and supply links. There is nothing moderators can do about coherent text with linked sources.
“So pro-Palestinian editors insert links to fake journals and books. They write total crap, but neither a human moderator nor an AI can check it: if there’s a link to a real book, everything is formally correct. With the advent of AI, you can generate text on any topic in two clicks. And the state apparatus always has more resources than you. The Internet has outgrown the crowdsourcing phase and needs to evolve in a different direction.”