Crowdsourcing

Why should you crowdsource online?

  • Online crowdsourcing platforms allow easy access to hundreds of thousands of volunteer participants or workers from around the globe to advance your research
  • Online crowdsourcing platforms substantially decrease the time it takes to complete manual, time-consuming tasks related to your research, such as text transcription, image classification, information/data checking, etc.
  • Online crowdsourcing platforms allow you to gather a diversity of perspectives and samples for survey research projects and/or research “microtasks”

What can I use crowdsourcing for?

Microtasking. Using crowdsourcing for microtasking involves having workers on a crowdsourcing platform complete various research tasks that require no prior training, such as text transcription, image classification, information checking, etc.

Survey/Experimental Research. Using crowdsourcing for survey/experimental research most often involves setting up a survey through an external site (e.g., Qualtrics or SurveyMonkey) and linking that survey to a crowdsourcing platform (e.g., MTurk, Prolific) to recruit participants from around the globe. Many platforms also offer pre-screening based on participant characteristics. If you do use crowdsourcing for survey/experimental research, pay attention to the sampling methodology each platform uses - many of them rely on convenience samples, which limits how far you can generalize your results.
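As a rough illustration of that linking step, here is a minimal sketch of the usual pattern: the platform opens your survey with a participant or worker ID in the URL, and you record that ID so responses can later be matched back to the platform for approval and payment. The survey URL, parameter name, and example ID below are hypothetical, and the exact details vary by platform (MTurk appends its own worker and assignment IDs to external surveys, while Prolific uses its own URL placeholders).

    # Minimal sketch (Python) of passing a participant ID into an externally hosted
    # survey and reading it back out. All names and URLs are placeholders.
    from urllib.parse import urlencode, urlparse, parse_qs

    SURVEY_BASE = "https://yourinstitution.qualtrics.com/jfe/form/SV_example"  # hypothetical survey link

    def study_url(participant_id: str) -> str:
        """Build the link the crowdsourcing platform sends a participant to."""
        return f"{SURVEY_BASE}?{urlencode({'participant_id': participant_id})}"

    def extract_participant_id(url: str):
        """Read the ID back out on the survey side (e.g., store it as embedded data)."""
        return parse_qs(urlparse(url).query).get("participant_id", [None])[0]

    link = study_url("A1B2C3EXAMPLE")
    print(link)                          # .../SV_example?participant_id=A1B2C3EXAMPLE
    print(extract_participant_id(link))  # A1B2C3EXAMPLE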

Tools

Before you start any crowdsourcing task, you should always do your due diligence and make sure the platform you select is appropriate for the type of crowdsourcing work you hope to accomplish. Worker/participant performance can differ noticeably across crowdsourcing platforms. A list of articles comparing and contrasting various crowdsourcing platforms is provided below.

Zooniverse

Zooniverse is the world’s largest platform for volunteer crowdsourced research.  It is built and maintained by the University of Oxford, the Adler Planetarium, and the University of Minnesota.   

Pros: Ideal for classification microtasks such as image identification and text transcription.  Project-building interface is easy to use. Large volunteer user base. Users can make classifications via a phone application or website.  Forums allow researchers to connect with their volunteers. Option to incorporate machine learning in classifications.

Cons: You cannot use Zooniverse for survey research if you are required to compensate your participants; there is no option to pay participants. 

Price: Free 

Website: www.zooniverse.org
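If you would rather script parts of your Zooniverse workflow than click through the web builder, Zooniverse also provides an official Python client, panoptes-client. Below is a minimal sketch of uploading a small batch of images ("subjects") to an existing project; the credentials, project slug, and file names are placeholders.

    # Sketch: upload image subjects to an existing Zooniverse project with
    # panoptes-client (pip install panoptes-client). All identifiers are placeholders.
    from panoptes_client import Panoptes, Project, Subject, SubjectSet

    Panoptes.connect(username="your-username", password="your-password")
    project = Project.find(slug="your-username/your-project-name")

    subject_set = SubjectSet()
    subject_set.links.project = project
    subject_set.display_name = "Batch 1"
    subject_set.save()

    for filename in ["scan_001.png", "scan_002.png"]:
        subject = Subject()
        subject.links.project = project
        subject.add_location(filename)            # local image file to upload
        subject.metadata["filename"] = filename   # optional metadata shown to volunteers
        subject.save()
        subject_set.add(subject)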

Amazon MTurk

MTurk (Amazon Mechanical Turk) is the archetypal crowdsourcing platform, originally designed with the idea that “there are still many things that human beings can do much more effectively than computers” (www.mturk.com/). 

Pros: Large worker/participant base. Provides a cost-effective method for recruiting participants for survey & online experimental research. Has templates for many different microtasks already built into the platform. 

Cons: MTurk was originally designed for microtasking, not as a tool for academic research, and you might sense that as you move through the platform. MTurk workers also tend to be relatively non-naive; many have participated in large numbers of survey research tasks, which can affect data quality. It is difficult to do anything complex in MTurk without being a bit tech-savvy.

Price: Price is set by you. A general guideline is to pay at least $0.10 per minute, or the equivalent of the minimum wage. Underpaying workers has been shown to increase dropout rates (Liu & Sundar, 2018).

Website: www.mturk.com
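If you are comfortable with a little code, the MTurk requester API is also available through the AWS SDK (boto3 in Python), which is convenient for posting external-survey HITs programmatically. The sketch below targets the requester sandbox, so nothing is charged; the survey URL and HIT text are placeholders, and the reward follows the $0.10-per-minute guideline above (a 10-minute task paying $1.00).

    # Sketch: post an external-survey HIT via the MTurk requester API (boto3),
    # using the sandbox endpoint. Survey URL and HIT text are placeholders.
    import boto3

    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    external_question = """
    <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
      <ExternalURL>https://yourinstitution.qualtrics.com/jfe/form/SV_example</ExternalURL>
      <FrameHeight>600</FrameHeight>
    </ExternalQuestion>
    """

    response = mturk.create_hit(
        Title="10-minute academic survey",
        Description="Answer a short survey hosted on an external site.",
        Keywords="survey, research, questionnaire",
        Reward="1.00",                     # USD, passed as a string
        MaxAssignments=100,
        LifetimeInSeconds=7 * 24 * 3600,   # HIT stays visible for one week
        AssignmentDurationInSeconds=30 * 60,
        Question=external_question,
    )
    print("HIT ID:", response["HIT"]["HITId"])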

CloudResearch (TurkPrime)

CloudResearch (previously TurkPrime) uses Amazon MTurk as its underlying platform but allows your studies to be controlled with greater specificity and flexibility. Specifically, CloudResearch (TurkPrime) has added features that are helpful for those doing research in social and behavioral sciences.  

Pros: On top of the pros of MTurk, the graphical user interface (GUI) is much more intuitive than MTurk’s. You can also pre-screen for many different participant demographics and characteristics through their “Prime Panels” feature. 

Cons: You are required to have three different accounts to use CloudResearch with MTurk: 1) MTurk, 2) Amazon Web Services, and 3) CloudResearch, which makes initial account setup cumbersome. CloudResearch also only distributes links; participants cannot complete tasks within the platform itself like they can with MTurk and Zooniverse.

Price: Price is set by you. A general guideline is to pay at least $0.10 per minute, or the equivalent of the minimum wage. CloudResearch (TurkPrime) sometimes charges additional fees on top of what you would typically pay for MTurk; information on those fees is available on the CloudResearch website.

Website: www.cloudresearch.com

Prolific

Prolific was designed with academic researchers in mind. While it is marketed as a survey research platform, it can be used for microtasking as well if you are creative enough and design your microtask on a survey platform. 

Pros: Prolific has one of the best and most comprehensive participant pre-screening tools we have seen, which is especially beneficial for survey & online experimental research, and the screening tool comes at no additional cost. It is also an extremely straightforward platform to use as a requester. Participants on Prolific are less exposed to common research tasks than MTurk workers (Peer et al., 2017). 

Cons: Prolific is one of the more expensive options, and it does not appear to offer much of an advantage over MTurk in terms of the quality of data collected for survey research (Peer et al., 2017). Prolific also only distributes links; participants cannot complete tasks within the platform itself like they can with MTurk and Zooniverse. 

Price: Minimum of $6.50/hr per participant.

Website: www.prolific.co
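Prolific follows the same link-in/link-out pattern as other survey-focused platforms: you give Prolific your survey URL (optionally with Prolific's ID placeholders appended so the participant ID lands in your data) and, at the end of the survey, send participants to your study's completion URL. The sketch below only illustrates that flow; the survey URL and completion code are placeholders, and the exact placeholder syntax and completion link for your study are shown on the Prolific study setup page.

    # Sketch of the Prolific link-in / link-out flow. The survey URL and the
    # completion code "ABC123" are placeholders.
    from urllib.parse import urlencode

    SURVEY_BASE = "https://yourinstitution.qualtrics.com/jfe/form/SV_example"

    # Study URL pasted into Prolific; Prolific replaces the {{%...%}} placeholders
    # with the participant's real IDs when sending them to your survey.
    study_url = (
        SURVEY_BASE
        + "?PROLIFIC_PID={{%PROLIFIC_PID%}}"
        + "&STUDY_ID={{%STUDY_ID%}}"
        + "&SESSION_ID={{%SESSION_ID%}}"
    )

    # End-of-survey redirect that marks the submission as complete on Prolific.
    completion_url = "https://app.prolific.co/submissions/complete?" + urlencode({"cc": "ABC123"})

    print(study_url)
    print(completion_url)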

Articles Comparing Crowdsourcing Sites

Peer, E., Brandimarte, L., Samat, S., & Acquisti, A. (2017). Beyond the Turk: Alternative platforms for crowdsourcing behavioral research. Journal of Experimental Social Psychology, 70, 153-163. https://doi.org/10.1016/j.jesp.2017.01.006

Palan, S., & Schitter, C. (2018). Prolific.ac—A subject pool for online experiments. Journal of Behavioral and Experimental Finance, 17, 22-27. https://doi.org/10.1016/j.jbef.2017.12.004

Litman, L., Robinson, J., & Abberbock, T. (2017). TurkPrime.com: A versatile crowdsourcing data acquisition platform for the behavioral sciences. Behavior Research Methods, 49(2), 433-442. https://doi.org/10.3758/s13428-016-0727-z

Vakharia, D., & Lease, M. (2015). Beyond Mechanical Turk: An analysis of paid crowd work platforms.

References

Liu, B., & Sundar, S. S. (2018). Microworkers as research participants: Does underpaying Turkers lead to cognitive dissonance? Computers in Human Behavior, 88, 61-69. https://doi.org/10.1016/j.chb.2018.06.017
