Summary:
“The Work of Sustaining Order in Wikipedia: The Banning of a Vandal” by Geiger and Ribes examines the role of software tools in the English Wikipedia, specifically autonomous editing by bots and human editing assisted by tools. Wikipedia is a “free online encyclopedia, created and edited by volunteers around the world and hosted by the Wikimedia Foundation.” Bots are “fully-automated software agents that perform algorithmically-defined tasks involved with editing, maintenance, and administration in Wikipedia.” Different bots serve different functions, ranging from simple tasks like correcting grammatical errors to more complicated ones like detecting personal insults. The authors present a detailed case study, “The Banning of a Vandal,” centered on Huggle, the most widely used assisted-editing tool on Wikipedia. Huggle places incoming edits into a queue; the user can perform a variety of actions, such as ‘revert’ and ‘warn’, on each edit as it is displayed, but cannot choose which edit to review next, since the ordering is decided by the software. In the case study, an anonymous user had been vandalizing multiple Wikipedia pages and was not discouraged by the warnings and comments left by moderators. Eventually this rogue user was blocked through the combined efforts of the network of moderators, or “vandal fighters,” and the bots, but the process was more cumbersome than one might expect. In addition to its quantitative and qualitative findings, the research also demonstrated the importance of trace ethnography for studying such sociotechnical systems.
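The paper does not give Huggle's internals, so to make the queue-and-triage workflow concrete, here is a minimal Python sketch under my own assumptions. The names (`Edit`, `score_edit`, `triage`) and the scoring heuristics are hypothetical; the real Huggle is a desktop client that talks to the MediaWiki API and draws on much richer signals (user whitelists, prior warning history, edit metadata).

```python
# Hypothetical sketch of a Huggle-like queue-and-triage loop.
# Not Huggle's actual code; names and heuristics are invented.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Edit:
    priority: int                       # lower value = reviewed sooner
    page: str = field(compare=False)
    user: str = field(compare=False)
    added_text: str = field(compare=False)

def score_edit(user: str, added_text: str) -> int:
    """Toy priority heuristic: anonymous (IP) editors and insult words
    push an edit toward the front of the queue."""
    score = 10
    if user.replace(".", "").isdigit():  # crude "is this an IP?" check
        score -= 5
    if any(w in added_text.lower() for w in ("idiot", "stupid")):
        score -= 5
    return score

def triage(queue: list[Edit]) -> None:
    """Review edits strictly in queue order -- as in Huggle, the reviewer
    acts on whatever edit the software presents next."""
    while queue:
        edit = heapq.heappop(queue)
        choice = input(f"{edit.user} -> {edit.page}: [r]evert / [w]arn / [s]kip? ")
        if choice == "r":
            print(f"reverted edit on {edit.page}")
        elif choice == "w":
            print(f"posted escalating warning on User talk:{edit.user}")

queue: list[Edit] = []
for page, user, text in [("Physics", "1.2.3.4", "you are all idiots"),
                         ("Paris", "AliceEditor", "fixed a typo")]:
    heapq.heappush(queue, Edit(score_edit(user, text), page, user, text))
triage(queue)  # the suspected vandalism surfaces first
```

The design choice worth noting is the one the paper emphasizes: because the software (not the reviewer) orders the queue, the tool shapes which edits get human attention at all.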
Reflections:
This is an interesting work. It was particularly insightful because I was unaware of the role that multiple bots play in Wikipedia editing. Bots and humans working cohesively have helped make Wikipedia the widely used resource it is today, but making Wikipedia a free resource that anyone can edit comes with a cost. This paper highlights the limitations of Wikipedia's bots and how a significant amount of coordinated effort across multiple moderators is needed to ban a vandal. Each moderator makes only a local judgement, but the Wikipedia talk pages keep a record of all the warnings issued against a particular user, so those local judgements can accumulate. Certain kinds of vandalism, like inserting obscenities and profanities, are easy to detect automatically. However, if a vandal deletes an important section from a Wikipedia page, identifying and rectifying the damage can require significant cognitive effort from moderators. An interesting question is how Wikipedia would be affected if it used a completely automated bot instead of the hybrid system it currently relies on. Would the bots be able to determine the significance of an edit or a change? How would that change the moderators' behaviors and actions? Since automated tools help determine the kinds of social activity that are possible on Wikipedia, would a completely automated bot significantly alter Wikipedia and user involvement? It would also be interesting to see whether trace ethnography could be used to study Reddit, another large sociotechnical system.
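To illustrate the asymmetry mentioned above (my own sketch, not from the paper): profanity insertion trips a cheap, high-precision textual check, while a section deletion only trips a size heuristic whose alarms are ambiguous and still require human judgement. The word list and the 50% threshold below are hypothetical.

```python
# Hypothetical contrast between an easy and a hard vandalism signal.
import re

# Toy word list; real tools use far larger lists plus scoring models.
PROFANITY = re.compile(r"\b(damn|crap)\b", re.IGNORECASE)

def flags_profanity(added_text: str) -> bool:
    """High precision: a match almost always deserves review."""
    return bool(PROFANITY.search(added_text))

def flags_large_removal(old_text: str, new_text: str) -> bool:
    """Low precision: a large shrink could be vandalism, but also a
    legitimate split, merge, or cleanup -- a human must judge whether
    the removed section actually mattered."""
    return len(new_text) < 0.5 * len(old_text)
```

The first check could safely run in a fully automated bot; the second can only prioritize edits for human review, which is essentially the hybrid division of labor the paper describes.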
Questions:
1. How did such a network of vandal fighters and bots come into being?
2. Do you think certain kinds of Wikipedia pages are more susceptible than others to vandalism?
3. Would completely automated bots help, or would they create new problems?
4. Could we conduct such a case study for Reddit? Why or why not?