01/29/20 – Sushmethaa Muhundan – Beyond Mechanical Turk: An Analysis of Paid Crowd Work Platforms

In 2005, Amazon launched Mechanical Turk, one of the first online crowd work platforms, and it quickly gained momentum in this space. Numerous other platforms offering similar services have emerged since then, yet the vast majority of crowd work researchers have concentrated their efforts on Mechanical Turk, often ignoring the alternative feature sets and workflow models these other platforms provide. This paper deviates from that pattern and gives a qualitative comparison of seven other crowd work platforms, with the intent of enriching research diversity and accelerating progress by moving beyond MTurk. Because so much work has been inspired by the shortcomings of MTurk, the broader perspective is often lost and the alternative platforms are overlooked. The paper covers the following platforms: ClickWorker, CloudFactory, CrowdComputing Systems, CrowdFlower, CrowdSource, MobileWorks, and oDesk.

I feel that this paper covers different types of crowd work platforms and provides a holistic view, as opposed to focusing on just one platform. The dimensions used to compare the platforms give a good overview of the differentiating features each platform provides. I agree with the authors that research on crowd work would benefit from diversifying its lens beyond a single platform, and this paper would be a good starting point from that perspective.

Having only been exposed to MTurk and its limitations thus far, I was pleasantly surprised to learn that many platforms offer peer reviews, plagiarism checks, and feedback. This not only helps ensure high-quality work but also gives workers a means to validate their work and improve. Workers are offered opportunities to enhance their skill sets through a variety of training resources such as certifications and training modules, and badges are used to display the skills they have acquired. This promotes the worker’s profile and helps the worker grow professionally. Many platforms display work histories, test scores, and areas of interest that guide requesters in choosing workers who match their selection criteria. A few platforms maintain payroll and provide bonuses for high-performing workers, which keeps workers motivated to deliver high-quality results.

I really liked the fact that a few platforms use automation to complete mundane tasks, eliminating the need for human workers to handle them. These platforms identify tasks that can be handled by automated algorithms, assign machine workers to those tasks, and reserve human judgment for the rest. This increases productivity and enables faster completion times.
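
To make the hybrid human-machine idea concrete, here is a minimal, hypothetical sketch of such task routing in Python. The task fields, the confidence threshold, and the toy classifier are my own illustrative assumptions, not the actual design of any platform discussed in the paper; real platforms would use far more sophisticated models and quality controls.

```python
# Hypothetical sketch of hybrid task routing: tasks a model can handle
# confidently are automated, the rest are sent to human workers.
from dataclasses import dataclass
from typing import Callable, List, Tuple


@dataclass
class Task:
    task_id: str
    payload: str  # e.g., text to categorize


def route_tasks(
    tasks: List[Task],
    predict: Callable[[str], Tuple[str, float]],  # returns (label, confidence)
    confidence_threshold: float = 0.9,
) -> Tuple[List[Tuple[Task, str]], List[Task]]:
    """Split tasks into (automated results, tasks needing human judgment)."""
    automated, needs_human = [], []
    for task in tasks:
        label, confidence = predict(task.payload)
        if confidence >= confidence_threshold:
            automated.append((task, label))  # machine worker handles it
        else:
            needs_human.append(task)  # fall back to the human crowd
    return automated, needs_human


if __name__ == "__main__":
    # Toy "model": only confident about empty strings, unsure otherwise.
    def toy_predict(text: str) -> Tuple[str, float]:
        return ("empty", 0.99) if not text.strip() else ("unknown", 0.3)

    done, pending = route_tasks(
        [Task("t1", ""), Task("t2", "classify this product review")],
        toy_predict,
    )
    print(f"automated: {len(done)}, sent to humans: {len(pending)}")
```

In a sketch like this, the confidence threshold is the key design knob: raising it routes more tasks to humans and reduces the risk of automated errors, while lowering it increases automation coverage and speed.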

  • How can the platforms learn from each other and incorporate best practices, so that they both motivate workers to perform well and help requesters efficiently find workers with the necessary skill set?
  • What are some ways to permit access to full identities for greater credibility while hiding identities when that is not desired? Is there a middle ground?
  • Since learning is an extremely important factor that benefits both workers (professional growth) and requesters (workers better equipped to handle the work), how can we ensure that all platforms give this aspect due importance?

One thought on “01/29/20 – Sushmethaa Muhundan – Beyond Mechanical Turk: An Analysis of Paid Crowd Work Platforms”

  1. If platforms use automation to solve mundane and repetitive tasks, can we still call this “crowd sourcing”? Is there a need to categorize such hybrid platforms differently?
