A Comparison of Social, Learning, and Financial Strategies on Crowd Engagement and Output Quality

L. Yu, P. André, A. Kittur, and R. Kraut, “A Comparison of Social, Learning, and Financial Strategies on Crowd Engagement and Output Quality,” in Proceedings of the 17th ACM Conference on Computer Supported Cooperative Work & Social Computing, New York, NY, USA, 2014, pp. 967–978.

Discussion leader: Will Ellis

Summary

In this paper, Yu et al. describe three experiments they ran to test whether accepted human resource management strategies can be employed, individually or in combination, to improve crowdworker engagement and output quality. In motivating their research, they describe how crowd platform features aimed at lowering transaction costs work at cross-purposes with worker retention (which they treat as equivalent to engagement). These “features” include simplified work histories, de-identification of workers, and the lack of long-term contracts. The strategies the authors employ to mitigate the shortcomings of these features are social (through team building and worker interaction), learning (through performance feedback), and financial (through long-term rewards based on quality).

The broad arc of each experiment is to 1) recruit workers for an article summarization task, 2) attempt to retain workers for a similar follow-up task through recruitment messages employing zero (control), one, two, or three strategies, 3) measure worker retention and output quality, and 4) repeat steps 2 and 3.

Experiment 1 tested all three strategies together against a control. The results showed that using the strategies in combination improved retention and quality by a statistically significant amount. Experiment 2 tested each strategy individually, as well as each pair of strategies, against a control. The results showed that only the social strategy significantly improved worker retention, while all three individual strategies improved output quality. However, no two-strategy combination significantly improved retention or quality. The authors interpret this as a negative interaction between the paired strategies and offer a few possible explanations, one of which is that they needed to put more effort into integrating the strategies. This led them to develop the integrated strategies they tested in Experiment 3.

In Experiment 3, the authors again tested each strategy individually. They also tested their improved strategy-pair treatments, as well as a more fully integrated three-strategy treatment. Again, only the social strategy by itself significantly improved retention. Whereas Experiment 1 showed a significant improvement in retention when all three strategies were employed, the results of Experiment 3 did not replicate that effect. In addition, only the learning strategy by itself and the three-strategy treatment improved output quality.

The authors conclude that the social strategy is the most effective way of increasing crowdworker retention and that the learning strategy is the most effective way of increasing output quality. These results, they argue, suggest that multiple strategies can undermine each other when employed together and that careful design is needed when devising crowdwork systems that combine them.

Reflection

Yu et al. have taken an interesting tack in applying traditional human resource strategies to crowdworking in an attempt to improve worker engagement and output quality. I think they’re correct in identifying the defining qualities of systems like Amazon Mechanical Turk – extremely short-term contracts, pseudonymous identities, simplified worker histories – as what makes such strategies difficult to employ. I appreciate their data-driven approach to measuring engagement, namely measuring it through worker retention. However, I can’t help but question their equating of worker retention with worker engagement. Engagement implies a mental investment in the work, whereas retention measures only who was motivated to come back for more tasks, regardless of their investment in the work.

A fascinating aspect of their experimental setup is that in Experiments 1 and 2, the social strategy claimed team involvement in the follow-up recruitment materials but did not actually implement it. Despite this, retention benefits were clearly realized with the social strategy in Experiment 2 and likely contributed to improved retention in Experiment 1. Further, even though actual social collaboration was implemented in Experiment 3, no further retention improvements were realized. It seems the idea of camaraderie is just as motivating as actual collaboration. The authors suggest that experiencing conflict with real teammates may offset the retention benefits of teammate interaction. Indeed, this may be where “retention” as a substitute for “engagement” breaks down: in traditional workplaces, workers engage not just with their work but also with each other. I imagine it’s much more difficult to feel engaged with pseudonymous teammates over the Internet than with teammates in person.

Disappointingly, the authors cannot claim much about combined strategies. While the three-strategy approach produces consistently better quality across Experiments 1 and 3, none of the strategy pairs significantly improves retention or quality. The authors can only recommend that designers of crowdwork systems combine strategies carefully. I hope future work explores in more depth the factors that undermine strategy combinations.

Questions

  • Do you think worker retention is a good measure of engagement?
  • In Experiment 1, the authors did not actually operationalize the learning strategy. What, if anything, do their results say about the learning strategy in this experiment?
  • What do you think of the reasons the authors give for why strategy combinations perform worse than individual strategies? What changes to their experimental setup could you suggest to improve outcomes?
  • This paper is very focused on improving outcomes for work requesters using HR management strategies. Is there any benefit to crowdworkers in recreating traditional HR management structures on top of crowdwork systems?
