Combining crowdsourcing and learning to improve engagement and performance.

Dontcheva, Mira, et al. “Combining crowdsourcing and learning to improve engagement and performance.” Proceedings of the 32nd Annual ACM Conference on Human Factors in Computing Systems. ACM, 2014.

Discussion Leader (Pro): Sanchit

Summary

This paper presents a crowdsourcing platform called LevelUp for Photoshop. The tool teaches workers Photoshop skills and tools through a series of tutorials and then lets them apply those skills to real-world images from several non-profit organizations that need touch-ups before the images can be used.

This crowdsourcing platform is different in that it aims to complete creative tasks through the crowd while also teaching the crowd a valuable skill that transfers to fields and scenarios outside the platform. Every user starts with a series of highly interactive, step-by-step tutorials. These tutorials are implemented as an extension for Adobe Photoshop, which allows the system to monitor which tools and actions the user has taken. The result is an easy-to-use, easy-to-learn-from tutorial system, because every action has feedback associated with it. The one thing the tool cannot do on its own is judge the quality of the image transformations. That task is instead delegated to Amazon Mechanical Turk workers, who compare before/after pairs of images to rate the quality and usefulness of the edits done by LevelUp participants.

The paper presents a thorough and detailed evaluation of the project, consisting of three deployments in which each component of the approach was added to the plugin in turn for user testing. The first deployment included only the interactive tutorial; the authors measured the number of levels players completed and collected feedback and opinions about the tutorial system. The second deployment added the challenge mode and evaluated the results with logs, MTurk quality checks, and expert quality examination; the photo edits were scored on a 1–3 scale for usefulness and novelty. The final deployment added real images from non-profit organizations, testing whether different organizations had different effects on users’ editing motivation and skills. The results of this last deployment were more modest, but still positive in that the skills the users learned proved genuinely useful.

Reflection

Crowdsourcing usually involves menial tasks with little to no value outside the platform itself, but the authors designed a unique and impressive methodology through which users both learn a new, useful skill like photo editing and then apply that skill to complete real-world photo editing tasks. They took advantage of people’s existing desire to learn Photoshop, and while teaching them, accomplished real photo editing work, killing two birds with one stone. Crowdsourcing does not have to involve monotonous tasks, nor do crowd workers have to be paid monetarily. Here the incentive is the training itself and the photo editing skills it develops, along with achievements and badges for completing specific tasks. These may not be as valuable as money, but they are enough to garner interest and to leverage the newly learned skills on existing tasks.

The authors conducted extremely detailed surveys and collected feedback from a pool of approximately ten thousand workers over a period of three years. This dedication to evaluation, and the resulting data, demonstrate the usefulness of this type of crowdsourcing platform: not all crowd work has to be menial, and users can actually learn a new skill and apply it outside crowd platforms. However, I do admit that the results were presented in a way that was difficult to follow. Graphs, charts, or tables would have been easier to digest than the numerous percentages embedded in the paragraphs.

By having both MTurk workers and experts judge the photo edits, the authors capture two perspectives: the average user’s sense of quality and usefulness, and the professional’s. That, in my opinion, is a strong group of evaluators for such a project, especially considering the massive scale at which people volunteered and completed these evaluation tasks on MTurk.

Lastly, I was really impressed by the Photoshop extension the authors developed. It looks clean, sleek, and easy to learn from, because it does not intimidate users the way Photoshop’s full palette of tools can. This approachability can help workers retain the skills they learn and apply them to future projects. I think photo editing is a fabulous skill for anyone to have: you can edit photos to focus attention on different areas, or to remove unwanted noise or extraneous subjects from an image. With a straightforward, step-by-step, interactive tool such as LevelUp, one can really expand their Photoshop skillset by a wide margin.

Questions

  • How many of you have edited pictures on Photoshop and are “decent” at it? How many would like to increase your skills and try out a learning tool like this?
  • Having a great tutorial is a necessity for such a concept to work where people can both learn and apply those skills elsewhere without their hands being held. What features do you think such tutorials should have to make them successful?
