Summary
In this paper, the authors study the social roles of editing tools in Wikipedia and the way vandalism fighting is addressed. The authors focus on the effect of automated tools, like bots, and assisted editing tools on the distributed editing process used by the encyclopedia. Wikipedia allows anyone in the world to edit the content of its articles, which makes maintaining the quality of that content a difficult task. The platform depends on a distributed social network of volunteers to approve or reject changes. Wikipedia uses a revision control system to help users see the changes: it shows both versions of the edited content side by side, allowing an editor to review the change history.

The authors note that Wikipedia uses bots and automated scripts to help edit some content and fight vandalism, and they describe several tools used to assist the editing process. The combination of humans, automated tasks, and assisted editing tools enables Wikipedia to handle such a massive number of edits and to fend off vandalism attempts. Most earlier research on the editing process is outdated because it did not pay close attention to these tools, whereas the authors highlight their importance in improving the overall quality of the content and in allowing more edits to be performed. These technological tools, like bots and assisted editing tools, have changed the way humans interact with the system and have a significant social effect on the kinds of activities that are possible in Wikipedia.
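Wikipedia's revision history is essentially a line-based comparison between stored versions of an article. As a rough illustration (not Wikipedia's actual implementation), Python's standard `difflib` can compute the added and removed lines that a side-by-side revision view is rendered from; the two revisions below are invented toy data:

```python
import difflib

# Two hypothetical revisions of an article paragraph (toy data, not real Wikipedia content).
old_revision = [
    "Wikipedia is a free online encyclopedia.",
    "Anyone can edit most of its articles.",
]
new_revision = [
    "Wikipedia is a free online encyclopedia.",
    "Anyone with internet access can edit most of its articles.",
]

# unified_diff marks removed lines with '-' and added lines with '+',
# which is the raw information a side-by-side revision view is built from.
diff = list(difflib.unified_diff(old_revision, new_revision,
                                 fromfile="revision_1", tofile="revision_2",
                                 lineterm=""))
for line in diff:
    print(line)
```

A real wiki stores many such revisions and lets volunteers scan the `+`/`-` lines quickly, which is what makes distributed review of edits feasible at scale.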
Reflection
I found the idea of distributed editing and vandalism fighting in Wikipedia interesting. Given the massive amount of content in Wikipedia, it is very challenging to maintain high quality when anyone in the world with internet access can make an edit. The internal revision control and the assisted tools used to support editing at that scale are amazing.
I also found the use of bots to automate edits for some content interesting. These automated scripts can help expedite content refreshes in Wikipedia, but they can also introduce errors. Some tools mentioned in the paper don't even show the bots' changes, so I am not sure whether there is some method that can measure the accuracy of these bots.
The concept of distributed editing is similar to that of a pull request on GitHub, where anyone can submit a change to an open-source project but only a group of maintainers or administrators can accept or reject the change.
Questions
- Since millions or billions of people have smartphones nowadays, the number of anonymous edits may increase significantly. Are these tools still efficient at handling such an increased volume of edits?
- Can we use deep learning or machine learning to fight vandalism and spam? The edits performed on articles could be treated as a rich training dataset.
- Why doesn't Wikipedia combine all the assisted editing tools into one tool that has the best features of each? Do you think this is a good idea, or do more tools mean more innovation and more choices?
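On the machine-learning question above: even a very simple text classifier hints at why edit histories are a rich training signal. Below is a toy Naive Bayes sketch in pure Python; the labeled edit summaries are entirely invented for illustration, and real anti-vandalism systems use far richer features (editor reputation, edit size, article context) than word counts:

```python
import math
from collections import Counter

# Toy labeled edit summaries (invented examples, not a real dataset).
training_edits = [
    ("added citation to journal article", "good"),
    ("fixed typo in introduction section", "good"),
    ("updated population figure with source", "good"),
    ("asdfasdf lol lol lol", "vandalism"),
    ("BUY CHEAP PRODUCTS CLICK HERE", "vandalism"),
    ("page blanked replaced with lol", "vandalism"),
]

def train(edits):
    """Count word frequencies per class for a Naive Bayes model."""
    word_counts = {"good": Counter(), "vandalism": Counter()}
    class_counts = Counter()
    for text, label in edits:
        class_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, class_counts

def classify(text, word_counts, class_counts):
    """Pick the class with the highest log-probability, using add-one smoothing."""
    vocab = {w for counts in word_counts.values() for w in counts}
    total = sum(class_counts.values())
    scores = {}
    for label in class_counts:
        score = math.log(class_counts[label] / total)
        n = sum(word_counts[label].values())
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1) / (n + len(vocab) + 1))
        scores[label] = score
    return max(scores, key=scores.get)

word_counts, class_counts = train(training_edits)
print(classify("lol lol asdfasdf", word_counts, class_counts))               # vandalism
print(classify("fixed typo and added citation", word_counts, class_counts))  # good
```

This is only a sketch of the idea that past edits can supervise a classifier; a production system would also need to handle class imbalance and adversarial vandals who adapt to the filter.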