On the hunt for spies on Wikipedia
Wikipedia is an understandable object of desire for governments hungry for influence. Social networks are a public square where many countries try to promote their agendas, but Wikipedia sits a few rungs above them in the trust global society places in it: it rests on a complex system of checks and balances, and dedicated groups of editors watch over and protect every change. Coordinated state interventions, if they exist, are complex missions.
These battles are fought not in a dark corner of the internet, but in one of its pillars. The more than 10,000 edits made over several months by hundreds of editors to the English-language page on the 2022 Russian invasion of Ukraine make it the most edited page this year. That page is the most prominent example, but others linked to the war, such as those on Vladimir Putin, Ukraine or Volodymyr Zelensky, saw similar, if smaller, conflicts. To put the English Wikipedia in context: in September alone, its pages were visited 7.5 billion times, compared with almost 1 billion for the Spanish edition.
The new report Information Warfare and Wikipedia, from the ISD (Institute for Strategic Dialogue) and CASM (Centre for the Analysis of Social Media), analyzes how state organizations can infiltrate important pages and modify their language. “Our work doesn’t empirically show that Wikipedia is vulnerable to any measurable degree,” says Carl Miller, one of the authors and director of research at CASM. “It tries to set out what we know about the threat. The overall point of the document is humbler: that Wikipedia is overlooked by disinformation researchers and journalists as a potential venue for information operations,” he adds.
Wikipedia’s impact is not limited to those who visit it. Its information feeds, for example, the answers given by assistants such as Siri or Google Assistant. In September 2019, a war between Chinese and Taiwanese editors produced different answers to the question “What is Taiwan?” At one point the answer was “a state in East Asia”; at another, “a province of the People’s Republic of China.” A BBC investigation detected 1,600 biased edits across 22 sensitive articles about China. It added: “We cannot verify who made these edits, why, or whether they reflect broader practice, but there are indications that they are not necessarily organic or random.” In this case, legitimate editors were harassed or pressured into leaving.
The long battles
The primary disinformation threat, then, is neither vandalism nor the usual battles between dedicated editors with reasonable but divergent views. On the Spanish Wikipedia there is, for example, a long-running conflict over the toponymy of some Catalan, Valencian and Balearic municipalities, explains Àlex Hinojo, editor of the Catalan Viquipèdia. “It has been decided to keep the Francoist toponymy, which in some places carries more tradition or connotation than in others: ‘San Quirico’ instead of ‘Sant Quirze’. But that is for the community to decide. It is the RAE and the INE that validate the place names, and Spanish-speaking Wikipedians use them. It is a recurring controversy, but I don’t think there is a hidden hand,” he adds.
But these chronic battles have nothing to do with the concern of the group of researchers in which Carl Miller participated. “Our intuition is that the biggest threat is entryism, the long-term infiltration of the community by state-backed actors who build reputations within Wikipedia and can then take advantage of the underlying politics and governance processes that protect it,” says Miller.
The English Wikipedia is the target of the most sophisticated attacks, but that does not mean other languages are spared, says Santiago de Viana, editor of the Spanish Wikipedia: “I am aware of suspicions and accusations of state-coordinated participation to modify content in Spanish. But being able to demonstrate it with reliable evidence, or to impose sanctions for it, is a very different matter,” he says, adding: “During electoral periods, for example, it is common to see an increase in both vandalism and promotional edits about politicians, but the people behind these efforts tend to remain unknown.”
This entryism becomes more delicate when it introduces, for example, the Kremlin’s views on the invasion of Ukraine. Miller lists four characteristics: “One, use subtler language changes that stay within the rules, like adding the Kremlin’s versions; two, coordinate votes so that your fellow travelers become administrators; three, use admin powers to resolve conflicts; and four, change the actual underlying rules that govern, for example, sources.”
This is not a hypothetical or imagined example. Wikimedia, the umbrella organization for Wikipedia, has banned a group of editors in China and removed their administrator powers, in what it described as “infiltration of Wikimedia systems, including positions with access to personally identifiable information and elected bodies of influence.” In other words, alleged activists or officials had gained access to privileged positions within the community.
What happened to the invasion of Ukraine
The report’s particular case study concerns the 86 blocked editor accounts that had been involved in editing the English page on the Russian invasion of Ukraine. The numbers show how difficult it is to detect the alleged coordinated activity of these accounts: over the years, they collectively made 794,771 revisions to 332,990 pages. The dominant topics ranged from Judaism and Poland to aviation and airports, or Iraq, Libya and Syria.
Here the problems begin. What kinds of edits are likely to be biased? “An edit on Wikipedia is more complex to study than a tweet or a Facebook post, because each act can involve not only adding content but also moving or removing it, often combining all of these options,” says the report, which focuses on issues that include the use of openly biased media as sources. “The team manually evaluated the edits containing these links and found that 16 of the edits were controversial and featured narratives consistent with Kremlin-sponsored information warfare,” it explains.
But when they looked beyond the Russian invasion page, they saw that this pattern of adding biased sources was more common: they found 2,421 examples across 667 pages, on topics ranging from every conceivable Russian conflict to the Formula 1 world championship and floods in Pakistan.
“This does not necessarily imply coordination or strategic intent, but it may highlight several areas of Wikipedia that could be more closely investigated,” says the report, whose conclusion echoes Miller’s: Wikipedia is neglected. “In a world where information warfare is more pervasive and sophisticated, this worries me precisely because Wikipedia is so valuable.”
Wikipedia has preventive measures such as warnings, article protection (which restricts the type of users who can edit) and the blocking of both IP addresses and registered accounts. “The different language versions of Wikipedia have noticeboards where the community can report disruptive or suspicious behavior,” says De Viana. “You just copy the links to the changes made by an editor, argue why there is a violation, and notify the administrators (‘librarians’ on the Spanish Wikipedia), who then make a decision,” he adds.
Hence the interest of state operations in controlling administrator positions. Though even that is not easy: “It is very difficult for an administrator to take a controversial action without anyone noticing,” says Francesc Fort, editor of the Catalan Viquipèdia. “If I were to block a random account, someone would complain. It’s complicated.”