Summary:
In this paper, the authors analyze and compare different crowd work platforms. They observe that research into such platforms has largely been limited to Mechanical Turk, and their study aims to encompass a broader range of them.
They compare seven AMT alternatives, namely ClickWorker, CloudFactory, CrowdComputing Systems, CrowdFlower, CrowdSource, MobileWorks, and oDesk. They evaluate the platforms on 12 different metrics to address high-level concerns: quality control, poor worker management tools, missing fraud prevention measures, and a lack of automated tools. The paper also identifies a distinct need among requesters to employ their own specialized workers through such platforms and to apply their own management systems and workflows.
The analysis shows the diversity of these platforms and identifies some commonalities, such as “peer review, qualification tests, leaderboards, etc.”, as well as some contrasting features, such as “automated methods, task availability on mobiles, ethical worker treatment, etc.”
Reflection:
The paper provides great evaluation metrics for judging aspects of a crowd work platform. The suggested workflow interfaces and tools could greatly streamline the process for requesters and workers. However, these crowd work platforms are businesses, so an incentive is required for them to invest in such additional processes. In the case of MT, its competitors do not have enough market share to make investing in additional streamlining processes viable. I think that as processes become more complex, requesters will be limited by the current framework, and a market opportunity will force the platforms to evolve by integrating the processes mentioned in the paper. This will be a natural progression following traditional development cycles.
I am sure a large company like Amazon has the resources and technical skills to lead such a maneuver for MT, and other platforms would follow suit. But the most important driver of change would be a market stimulus born of necessity rather than mere desire. Currently, the responsibility falls on the requester because the need for these processes is rare.
Also, the paper analyzes the platforms only from a requester’s perspective. Currently, the worker is just a de-humanized number, but adding such workflows may lead to discrimination between geographical regions or distrust of a worker’s declared skill set. This would bring real-world challenges into the “virtual workplace” and more often lead to difficult working conditions for remote workers. It might also lead to a worrisome exclusivity, which the current platforms avoid quite well. However, I believe user checks and detection of fraud networks in the system are areas the platforms should really focus on to improve the experience for requesters.
I think a different version of the service should be offered to corporations that need workflow management and expert help. For quality control, I believe the research community should investigate globally applicable, efficient processes for these areas.
Questions:
- How big is Mechanical Turk’s market share compared to its competitors?
- Does Mechanical Turk need to take a lead in crowd work reforms?
- Is the difference between platforms due to the kind of crowd work they support? If so, which type of work has better worker conditions?
- How difficult would it be for MT to integrate the quality controls and address the other challenges mentioned in the paper?