Wikipedia bots


Wikipedia bots are Internet bots (computer programs) that perform simple, repetitive tasks on Wikipedia. One prominent example of an internet bot used in Wikipedia is Lsjbot, which has generated millions of short articles across various language editions of Wikipedia.[1]

Image: Bots are computer scripts that operate in an automated or semi-automated way and can perform certain actions more efficiently than humans.

Activities


Computer programs, called bots, have often been used to automate simple and repetitive tasks, such as correcting common misspellings and stylistic issues, or to start articles, such as geography entries, in a standard format from statistical data.[2][3][4] Additionally, there are bots designed to automatically notify editors when they make common editing errors (such as unmatched quotes or unmatched parentheses).[5]
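As an illustration of the kind of task such bots automate, the following is a minimal sketch of a typo-correction pass over wiki text. It is not the code of any actual Wikipedia bot, and the table of misspellings is a tiny invented sample; real bots draw on much larger, community-maintained lists and submit their changes through the MediaWiki API.

```python
import re

# Hypothetical sample of common misspellings -> corrections.
COMMON_TYPOS = {
    "recieve": "receive",
    "teh": "the",
    "occured": "occurred",
}

def fix_typos(text: str) -> str:
    """Replace whole-word occurrences of known misspellings."""
    for typo, fix in COMMON_TYPOS.items():
        # \b anchors ensure only complete words are replaced.
        text = re.sub(r"\b%s\b" % re.escape(typo), fix, text)
    return text
```

A real bot would wrap a pass like this in page-fetching and saving logic, and would run only with prior approval and from a separate bot account, as the policy described below requires.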

Anti-vandalism bots like ClueBot NG, created in 2010, are programmed to detect and revert vandalism quickly.[3] Bots can also flag edits from particular accounts or IP address ranges, as occurred after the shooting down of Malaysia Airlines Flight MH17 in July 2014, when edits were reportedly made from IP addresses controlled by the Russian government.[6]
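ClueBot NG itself uses machine learning to classify edits. Purely as an illustration of the general idea of automated edit screening, a far cruder rule-based check might look like the sketch below; the signals and thresholds here are invented for the example and do not reflect any real bot's logic.

```python
def looks_like_vandalism(old_text: str, new_text: str) -> bool:
    """Toy heuristic: score an edit and flag it above a threshold."""
    score = 0.0
    # Large deletions of existing content are suspicious.
    if len(new_text) < 0.5 * len(old_text):
        score += 0.5
    # All-caps "shouting" words raise the score.
    if any(len(w) > 4 and w.isupper() for w in new_text.split()):
        score += 0.3
    # Runs of exclamation marks raise it further.
    if "!!!" in new_text:
        score += 0.2
    return score >= 0.5
```

A production system would combine many more signals (user history, edit summaries, word lists, trained classifiers) and would revert only above a carefully tuned false-positive threshold.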

Bots on Wikipedia must be approved before activation.[7]

A bot once created up to 10,000 articles on the Swedish Wikipedia in a single day.[8] According to Andrew Lih, Wikipedia's expansion to millions of articles would be difficult to envision without the use of such bots.[9] The Cebuano, Swedish and Waray Wikipedias are known for their high proportions of bot-created articles.[10]

One notable development in recent years has been the use of bots to handle vandalism-fighting tasks in place of human labor. By some estimates, bots already revert about half of all vandalism. Human patrollers have praised the bots' accuracy and speed in remarks posted on their talk pages.[11]

Bot policy

Wikipedia's bot policy is intended to reduce the risks bots pose without compromising their usefulness.[citation needed] Under the guidelines, bots that update metatags and fix spelling "must be harmless and useful, have approval, use separate user accounts, and be operated responsibly."[7] A Wikipedia bot may go live only after its application has been approved and it has been publicly registered.[7]

Interactions

On Wikipedia, interactions between bots tend to be more reciprocal and prolonged than those between humans. Bots operating in different cultural contexts may also behave differently, much like people. According to research, even comparatively "dumb" bots can produce complex interactions, a finding with important consequences for the study of artificial intelligence. Understanding the factors that shape bot-bot interactions is essential for making bots perform effectively.[12]

Types of bots

Image: Icon that typically represents the bot user right on Wikipedia.

One way to sort bots is by what activities they perform:[13][14]

References
