A new study has revealed a rather interesting fact. Do you know who the harshest critics of Wikipedia's automated software bots are? Other Wikipedia bots. And these silent, unseen, but relentless battles appear to have been going on for years.
Wikipedia is one of the most popular websites on the Internet. According to the Free Encyclopedia itself, it ranks fifth on the Alexa top 100 websites list. The site has gathered some 40 million entries since it was launched back in 2001.
Now, a recently released study points out a previously unknown fact about Wikipedia, or more exactly, about its automated software bots. The analysis was carried out by a team of University of Oxford researchers led by Taha Yasseri.
The findings were released last week in a paper published in the journal PLOS ONE. Available online since February 23rd, the study is titled “Even good bots fight: The case of Wikipedia”.
The analysis was motivated by a recent trend. Over the past few years, the number of online bots has increased significantly. But according to the report, little is known about how they interact with one another. As such, the researchers took to studying some of the oldest examples of such technology.
They turned to the Wikipedia bots and studied ten years of their ‘life’, tracking Wikipedia edits from 2001 to 2010. These bots are editing algorithms powered by artificial intelligence, and they seem to have a hard time getting along, at least according to this study.
The bots have quite clear tasks. They are in charge of maintaining Wikipedia's content: they check spelling, create links, and automatically import new content. They also enforce bans and undo vandalism where needed.
Wikipedia bots are designed to support and cooperate with human users and editors. The bots’ early activity was quite low, but as the platform grew and the bots’ software improved, so did their activity.
In 2014, a reported 15 percent of all edits made across all the language editions came from bots. Yet Wikipedia bots make up only around 0.1 percent of all the registered editors on the online encyclopedia.
Wikipedia bots are developed to be benevolent toward humans. But their relations with their own kind are seemingly tenser. The researchers were surprised to detect ongoing conflicts between bots, some of which lasted for years.
Two bots seemed to have the hardest time getting along. Darknessbot and Xqbot clashed over about 3,600 articles between 2009 and 2010, reverting each other’s edits across a wide range of domains. Russbot and Tachikoma were also seen to clash, each modifying over 1,000 of the other bot’s edits.
Yasseri stated that these clashes came as a surprise, especially since the Wikipedia bots were not intended to enter into conflicts with each other. Still, they seem to have been accidentally caught up in loops.
In such cases, their programming inadvertently produced this editorial combat. Yasseri notes that these are good bots with good intentions, developed on the same open-source technology.
The analysis revealed yet another surprising fact: the bot conflicts played out differently depending on the language. The Portuguese edition saw the highest number of bot-on-bot edits, while the German edition saw the smallest; the English edition fell somewhere in the middle.
Following these surprising results, the researchers pointed out that Wikipedia bots behave differently across cultural environments, and that their conflicts also differ from those between human editors.
According to the scientists, these findings can have various implications. For example, they may point to a new factor to consider when designing AI agents, and they draw attention to the need for more sociological research on bots.
Image Source: Wikimedia