How Wikipedia Tackles Misinformation During Elections

From bots and extensions to assisted editing programs and web applications, the Wikimedia Foundation has deployed a plethora of tools for patrolling Wikipedia this election year. In 2024, elections are taking place in over 60 countries and the European Union. More people than ever before — nearly half of the world’s population — are eligible to vote this year.

This means that more people than ever will be looking for information to shape their decisions. India is in the last phase of its general elections, with vote counting set for June 4. When users turn to a search engine for that information, a Wikipedia page is often among the first results returned.

To understand how Wikipedia is tackling misinformation and disinformation at such a crucial time, The Indian Express spoke with Costanza Sciubba Caniglia, anti-disinformation strategy lead at the Wikimedia Foundation, the non-profit that hosts the global website. The information on Wikipedia is created and curated by a community of over 2,65,000 volunteers from around the world, who compile and share information on notable subjects citing reliable sources. Sciubba Caniglia said that these volunteers vigilantly defend against information that does not meet the site’s policies, and that the whole process of content moderation by Wikipedia volunteers is open and transparent.

Use of AI on Wikipedia

Sciubba Caniglia said the organisation believes that artificial intelligence (AI) should support the work of humans and not replace them. The approach to AI on Wikipedia has always been through “closed-loop” systems in which humans stay in the loop: they edit, improve, and audit the work done by AI. While all content on Wikipedia is created and curated by humans, since 2002 some volunteers have used AI and machine learning (ML) tools to support their work, especially on time-consuming and repetitive tasks.

Volunteers have developed and deployed a range of specialised tools for patrolling Wikipedia, including bots (e.g., ClueBot NG, ST47ProxyBot); gadgets, userscripts, and extensions (e.g., Twinkle, LiveRC, Real-Time Recent Changes, Page Curation); assisted editing programs (e.g., Huggle, Vandal Fighter, AutoWikiBrowser); and web applications (e.g., Checkwiki, CopyPatrol, XTools, Global user contributions). These tools help volunteers quickly identify and revert wrongful edits. The Wikimedia Foundation has a team creating a new generation of AI models to increase the capacity of volunteers.
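To make the idea of “patrolling” concrete, here is a minimal sketch, in Python, of how a tool of this kind might poll Wikipedia’s public MediaWiki API for recent edits. The API endpoint and parameters are real, but the large-removal heuristic is purely an illustrative assumption and not how any of the tools named above actually works.

import requests

API = "https://en.wikipedia.org/w/api.php"

def fetch_recent_changes(limit=25):
    # Ask the public MediaWiki API for the most recent edits on English Wikipedia.
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|user|comment|timestamp|sizes",
        "rctype": "edit",
        "rclimit": limit,
        "format": "json",
    }
    response = requests.get(API, params=params, timeout=10)
    response.raise_for_status()
    return response.json()["query"]["recentchanges"]

def flag_large_removals(changes, threshold=-500):
    # Illustrative heuristic only: surface edits that remove a lot of text,
    # since page blanking is a common form of vandalism. Real tools use far
    # richer signals and leave the final judgement to human patrollers.
    return [c for c in changes if c.get("newlen", 0) - c.get("oldlen", 0) < threshold]

if __name__ == "__main__":
    for change in flag_large_removals(fetch_recent_changes()):
        print(change["timestamp"], change["title"], change["user"], change["comment"])

A real patrolling tool would combine many more signals and, crucially, surface its findings to human editors rather than act on them automatically.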

Detection of vandalism

When a user account or IP address repeatedly violates Wikipedia policies, administrators can take disciplinary action, including blocking them from further editing. Violations can include things such as repeated vandalism, undisclosed paid editing, or edit warring. One of the bots created by volunteers and active for over a decade, ClueBot NG, detects vandalism on Wikipedia. Importantly, volunteers use an interface to manually label edits as vandalism or not. A training algorithm then uses that labelled data to build a model that flags new edits suspected of vandalism, which the bot reverts.
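As a rough illustration of that label-then-train workflow, the sketch below trains a toy text classifier on volunteer-labelled edits and scores a new edit. It assumes scikit-learn and raw edit text as input; it is not ClueBot NG’s actual pipeline, and the tiny dataset and the 0.9 threshold are illustrative assumptions.

# Minimal sketch of the labelling-and-training loop described above.
# NOT ClueBot NG's real pipeline; features, data, and threshold are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Edits manually labelled by volunteers: 1 = vandalism, 0 = legitimate.
labelled_edits = [
    ("replaced the whole section with profanity", 1),
    ("blanked the infobox and all references", 1),
    ("added sourced population figures from the census", 0),
    ("fixed a typo in the election results table", 0),
]
texts = [text for text, _ in labelled_edits]
labels = [label for _, label in labelled_edits]

# Train a simple text classifier on the labelled edits.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score a new incoming edit; a real bot only reverts above a high-confidence
# threshold and leaves borderline cases to human reviewers.
new_edit = "removed all citations and inserted random characters"
vandalism_probability = model.predict_proba([new_edit])[0][1]
print("Estimated vandalism probability:", vandalism_probability)
if vandalism_probability > 0.9:
    print("Queue this edit for automatic revert and human review.")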

Measures to prevent biased editing or vandalism on politically sensitive pages, especially during elections

The most fundamental pillar of Wikipedia is that it must be written from a neutral point of view. This means that all encyclopaedic content on Wikipedia must be attributed to a reliable source and be presented fairly, proportionately, and, as far as possible, without editorial bias.

During events of global importance, such as elections, Wikipedia editors often take steps to ensure that information about relevant topics stays as reliable as possible. Some of the measures are as follows:

- Administrators on Wikipedia can temporarily ‘protect’ a page from modification by less-experienced and new users (the sketch after this list shows how a page’s protection status can be checked programmatically).

- Experienced editors use ‘watchlists’ to keep track of pages they are interested in, helping them quickly identify and respond to mis/disinformation attempts on those articles. This feature is particularly useful at times when certain pages receive a high number of new edits.

- Wikipedia’s volunteer-based arbitration committee has built specific rules for contentious topic areas that are more prone to persistent disruptive editing than others. These rules give Wikipedia administrators more latitude in enforcing Wikipedia’s guidelines and acting promptly to mitigate and prevent disruptive edits. Like all other activities on Wikipedia, arbitration enforcement actions are logged publicly and can be appealed through community channels.

- More than 140 volunteer Wikipedia editors are part of WikiProject Elections and Referendums, established in 2009 to standardise and improve Wikipedia’s coverage of elections.

- Volunteers work together to maintain a publicly visible list of reliable and unreliable sources, which can be used to ensure that the information shared comes from a trustworthy source.

- For this election year, the Foundation has also set up a cross-functional working group of staff across the organisation to prepare for risks and support volunteers through a dedicated communication channel.
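As a small illustration of the page-protection measure in the first point, the sketch below asks the public MediaWiki API which protection levels are currently applied to a given article. The endpoint and parameters are real; the article title is only an example, not a claim about that page’s actual status.

import requests

API = "https://en.wikipedia.org/w/api.php"

def get_protection_status(title):
    # Query the public MediaWiki API for a page's current protection settings.
    params = {
        "action": "query",
        "prop": "info",
        "inprop": "protection",
        "titles": title,
        "format": "json",
    }
    response = requests.get(API, params=params, timeout=10)
    response.raise_for_status()
    pages = response.json()["query"]["pages"]
    page = next(iter(pages.values()))
    return page.get("protection", [])

# Example: print the protection type, level, and expiry for an article.
for entry in get_protection_status("2024 Indian general election"):
    print(entry["type"], entry["level"], entry.get("expiry"))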

Challenges using AI or ML to safeguard Wikipedia content during past elections

Around the 2020 US Presidential elections, the Foundation, in collaboration with multiple universities around the world, ran a suite of new research projects that examined how disinformation could manifest on the site. Insights from that research led to the development of new human-centred machine-learning services that enhance the volunteer community’s oversight of the projects. These algorithms supported editors in tasks such as detecting unsourced statements on Wikipedia and identifying malicious edits and behaviour trends. More than 56,000 Wikipedia volunteers protected about 2,000 election-related pages at all hours. The main US election article saw just 33 reverted edits during that period, a testament to the community’s preparedness and the defences they put in place.

Future developments

Sciubba Caniglia said that the Wikimedia Foundation’s mission is to provide free knowledge to all, and that to fulfil this mission it recognises the need to adapt to new trends in the ways that people search for information and participate in knowledge creation. In this spirit, the Foundation is exploring new ways to meet people’s knowledge needs responsibly, including through ‘generative AI platforms’.


Written By

Jiri Bílek

In the vast realm of AI and U.N. directives, Jiri crafts tales that bridge tech divides. With every word, he champions a world where machines serve all, harmoniously.