{{Short description|Micro-work service subsidiary of Amazon}}
{{Update|inaccurate=yes|date=July 2015}}
{{Infobox website
| logo =
| alexa = 5,330 ({{as of|2016|09|05|alt=September 2016}})<ref name="alexa">{{cite web|url= http://www.alexa.com/siteinfo/mturk.com |title= mturk.com Site Info | publisher= [[Alexa Internet]] |accessdate= 2016-09-07 }}</ref><!--Updated monthly by OKBot.-->
| url = {{URL|www.mturk.com}}
| current status = Live
}}
'''Amazon Mechanical Turk''' ('''MTurk''') is a [[crowdsourcing]] website with which businesses can hire remotely located "crowdworkers" to perform discrete on-demand tasks that computers are currently unable to do as economically. It is operated under [[Amazon Web Services]], and is owned by [[Amazon.com|Amazon]].<ref name="mturk1">{{cite web|url=https://www.mturk.com/mturk/help?helpPage=overview|access-date=14 April 2017|title=Amazon Mechanical Turk, FAQ page}}</ref> Employers, known as ''requesters,'' post jobs known as ''Human Intelligence Tasks'' (HITs), such as identifying specific content in an image or video, writing product descriptions, or answering survey questions. Workers, colloquially known as ''Turkers'' or ''crowdworkers'', browse among existing jobs and complete them in exchange for a fee set by the requester. To place jobs, requesters use an open [[application programming interface]] (API), or the more limited MTurk Requester site.<ref>{{cite web|url=http://requester.mturk.com |title=Overview &#124; Requester &#124; Amazon Mechanical Turk |publisher=Requester.mturk.com |access-date=2011-11-28}}</ref> {{As of|April 2019}}, requesters could register from 49 approved countries.<ref>{{Cite web|url=https://www.mturk.com/help|title=Amazon Mechanical Turk|website=www.mturk.com}}</ref>


== History ==
The service was conceived by [[Venky Harinarayan]] in a U.S. patent disclosure in 2001.<ref name="uspto">Multiple sources:
*{{Cite web|url = http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=%2Fnetahtml%2FPTO%2Fsrchnum.htm&r=1&f=G&l=50&s1=7,197,459.PN.&OS=PN/7,197,459&RS=PN/7,197,459/|title = Hybrid machine/human computing arrangement|date = 2001|access-date = 28 July 2016|archive-date = 12 June 2018|archive-url = https://web.archive.org/web/20180612141325/http://patft.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PALL&p=1&u=/netahtml/PTO/srchnum.htm&r=1&f=G&l=50&s1=7,197,459.PN.&OS=PN/7,197,459&RS=PN/7,197,459/|url-status = dead}}
*{{US Patent|7197459}}</ref> Amazon coined the term ''artificial artificial intelligence'' for processes that outsource some parts of a computer program to humans, for tasks that humans carry out much faster than computers. It is claimed{{By whom|date=April 2023}} that [[Jeff Bezos]] was responsible for proposing the development of Amazon's Mechanical Turk to realize this process.<ref>{{cite news|url=http://www.economist.com/node/7001738?story_id=7001738|title=Artificial artificial intelligence|newspaper=[[The Economist]] | date=2006-06-10}}</ref>


The name ''Mechanical Turk'' was inspired by "[[Mechanical Turk|The Turk]]", an 18th-century chess-playing [[automaton]] made by [[Wolfgang von Kempelen]] that toured Europe, and beat both [[Napoleon Bonaparte]] and [[Benjamin Franklin]]. It was later revealed that this "machine" was not an automaton, but a human [[chess master]] hidden in the cabinet beneath the board and controlling the movements of a humanoid dummy. Analogously, the Mechanical Turk online service uses remote human labor hidden behind a computer interface to help employers perform tasks that are not possible using a true machine.


MTurk launched publicly on November 2, 2005. Its user base grew quickly. In early- to mid-November 2005, there were tens of thousands of jobs, all uploaded to the system by Amazon itself for some of its internal tasks that required human intelligence. HIT types expanded to include transcribing, rating, image tagging, surveys, and writing.


In March 2007, there were reportedly more than 100,000 workers in over 100 countries.<ref name=nyt /> This increased to over 500,000 registered workers from over 190 countries in January 2011.<ref name=awsdevforum>{{cite web|title=AWS Developer Forums|url=https://forums.aws.amazon.com/thread.jspa?threadID=58891|access-date=14 November 2012}}</ref> That year, Techlist published an interactive map pinpointing the locations of 50,000 of their MTurk workers around the world.<ref name=turkermap>{{cite web|last=Tamir|first=Dahn|title=50000 Worldwide Mechanical Turk Workers|url=http://techlist.com/mturk/global-mturk-worker-map.php|publisher=techlist|access-date=September 17, 2014}}</ref> By 2018, research demonstrated that while over 100,000 workers were available on the platform at any time, only around 2,000 were actively working.<ref name="Djellel 2018">{{cite book |last1=Djellel |first1=Difallah |last2=Filatova |first2=Elena |last3=Ipeirotis |first3=Panos |title=Proceedings of the Eleventh ACM International Conference on Web Search and Data Mining |chapter=Demographics and Dynamics of Mechanical Turk Workers |url=https://www.ipeirotis.com/wp-content/uploads/2017/12/wsdmf074-difallahA.pdf |date=2018 |pages=135–143 |doi=10.1145/3159652.3159661|isbn=9781450355810 |s2cid=22339115 }}</ref>


== Overview ==
A user of Mechanical Turk can be either a "Worker" (contractor) or a "Requester" (employer). Workers set their own hours and are not under any obligation to accept any particular task. Workers have access to a dashboard that displays three sections:
* Total earnings: the total amount the Worker has earned from completing HITs, the amount earned in bonuses, and the sum of the two.
* HIT status: a list of daily activity and daily income, along with the number of HITs submitted, approved, rejected, or pending for each day.
* HIT totals: summary figures for the Worker's HITs, including the percentage of accepted HITs that were submitted, returned, or abandoned, and the percentage of submitted HITs that were approved, rejected, or pending.


{{Globalize|date=April 2023}}
Amazon classifies Workers as [[Independent contractor|contractors]] rather than employees and does not pay payroll taxes. Classifying Workers as contractors lets Amazon sidestep requirements such as [[minimum wage]], [[overtime]], and [[workers compensation]], a common practice among "gig economy" platforms. Workers are legally required to report their income as self-employment income.
|url= http://www.behind-the-enemy-lines.com/2010/03/new-demographics-of-mechanical-turk.html
|title=The New Demographics of Mechanical Turk
|author=[[Panos Ipeirotis]]
|date=March 9, 2010
|publisher=[[New York University]]
|accessdate=2014-03-24}}</ref> He currently runs a website showing worker demographics that is updated hourly. It shows that approximately 80% of workers are located in the United States and 20% are located elsewhere in the world, most of whom are in India.<ref name="MTurk-tracker">{{cite web|title=MTurk Tracker|url=http://demographics.mturk-tracker.com/|website=demographics.mturk-tracker.com|accessdate=1 October 2015}}</ref>


In 2013, the average wage for the multiple microtasks assigned, if performed quickly, was about one dollar an hour, with each task averaging a few cents.<ref name="utne">[http://www.utne.com/science-technology/amazon-mechanical-turk-zm0z13jfzlin.aspx "Amazon Mechanical Turk: The Digital Sweatshop"] Ellen Cushing ''[[Utne Reader]]'' January–February 2013</ref> However, calculating people's average hourly earnings on a microtask site is extremely difficult and several sources of data show average hourly earnings in the $5–$9 per hour<ref>{{Cite journal |last=Berg |first=Janine |date=2015–2016 |title=Income Security in the On-Demand Economy: Findings and Policy Lessons from a Survey of Crowdworkers |url=https://heinonline.org/HOL/Page?handle=hein.journals/cllpj37&id=579&div=&collection= |journal=Comparative Labor Law & Policy Journal |volume=37 |pages=543}}</ref><ref name=":2">{{Cite web |last=Geiger |first=Abigail |date=2016-07-11 |title=Research in the Crowdsourcing Age, a Case Study |url=https://www.pewresearch.org/internet/2016/07/11/research-in-the-crowdsourcing-age-a-case-study/ |access-date=2023-01-09 |website=Pew Research Center: Internet, Science & Tech |language=en-US}}</ref><ref>{{Cite web |title=Amazon Mechanical Turk -Fair Crowd Work |url=http://faircrowd.work/platform/amazon-mechanical-turk/ |access-date=2023-01-09 |language=en}}</ref><ref name=":3">{{Cite journal |last1=Moss |first1=Aaron J |last2=Rosenzweig |first2=Cheskie |last3=Robinson |first3=Jonathan |last4=Jaffe |first4=Shalom Noach |last5=Litman |first5=Leib |date=2020-04-28 |title=Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages |url=https://osf.io/jbc9d |doi=10.31234/osf.io/jbc9d|s2cid=236840556 }}</ref> range among a substantial number of Workers, while the most experienced, active, and proficient workers may earn over $20 per hour.<ref>{{Cite web |date=2019-11-18 |title=MTurk is the most ethical way to recruit crowd workers. |url=https://blog.turkerview.com/writer-who-never-learned-to-drive-works-for-uber/ |access-date=2023-01-09 |website=Blog {{!}} TurkerView |language=en}}</ref>


Workers can have a postal address anywhere in the world. Payment for completing tasks can be redeemed on Amazon.com via [[gift certificate]] (gift certificates are the only payment option available to international workers, apart from India) or can be transferred to a Worker's U.S. bank account.


Requesters can ask that Workers fulfill qualifications before engaging in a task, and they can establish a test designed to verify the qualification. They can also accept or reject the result sent by the Worker, which affects the Worker's reputation. {{As of|April 2019}}, Requesters paid Amazon a minimum 20% commission on the price of successfully completed jobs, with increased amounts for {{clarify|text=additional services|reason=quantitatively or qualitatively?|date=April 2023}}.<ref name="nyt">{{Cite web|url = https://www.mturk.com/pricing|title = Mturk pricing|date = 2019|access-date = 16 April 2019|website = AWS|publisher = Amazon}}</ref> Requesters can use the Amazon Mechanical Turk API to programmatically integrate the results of the work directly into their business processes and systems. When employers set up a job, they must specify
* how much they are paying for each HIT accomplished,
* how many workers they want to work on each HIT,
* the maximum time a worker has to work on a single task,
* how much time the workers have to complete the work,
as well as the specific details about the job they want completed.
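These parameters correspond directly to fields of the HIT-creation call in the MTurk API. The following is a minimal sketch of posting a HIT with the boto3 Python SDK against the Requester sandbox; the title, reward, survey URL, and other concrete values are illustrative placeholders rather than recommendations.

<syntaxhighlight lang="python">
import boto3

# Sketch only: post a HIT to the Requester sandbox. AWS credentials are
# assumed to be configured in the environment; the production endpoint
# (https://mturk-requester.us-east-1.amazonaws.com) posts real, paid HITs.
mturk = boto3.client(
    'mturk',
    region_name='us-east-1',
    endpoint_url='https://mturk-requester-sandbox.us-east-1.amazonaws.com',
)

# MTurk accepts questions as XML; an ExternalQuestion embeds a task
# hosted elsewhere (the survey URL below is a placeholder).
question_xml = """<ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
  <ExternalURL>https://example.com/survey</ExternalURL>
  <FrameHeight>600</FrameHeight>
</ExternalQuestion>"""

response = mturk.create_hit(
    Title='Answer a short demographic survey',
    Description='A five-minute survey about internet use.',
    Reward='0.50',                     # payment per completed assignment, in USD
    MaxAssignments=100,                # how many Workers may work on the HIT
    AssignmentDurationInSeconds=600,   # maximum time a Worker has for one task
    LifetimeInSeconds=86400,           # how long the HIT remains available
    Question=question_xml,
)
print(response['HIT']['HITId'])        # identifier used to collect results later
</syntaxhighlight>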


=== Other aspects ===
;Crowd labor
Amazon Mechanical Turk provides access to a crowdsourced market of workers who can help complete work on an as-needed basis. For work that does not require significant task-specific training, this can compare favorably with the traditional costs of hiring and managing temporary staff. Workers, in turn, can select among a variety of different tasks.

;Quality management
Amazon Mechanical Turk allows more than one Worker to submit a response to the same HIT; when a specified number of Workers give the same answer, the HIT is automatically approved (a plurality-agreement rule, sketched after this list). All data for HITs is available for viewing as soon as it is submitted, allowing Requesters to assess quality manually. If a Requester deems a result inadequate, the job is rejected and the Requester is not required to pay, which can lead to friction between the parties.

;Price determination
Workers are free to choose the tasks they find most interesting or best paid, and Requesters set payments based on their desired balance of performance and cost-efficiency. Payments are made in cooperation with Amazon Payments.

;User qualification
Amazon Mechanical Turk allows Requesters to screen Workers with brief qualification tests before they can work on a task. A qualification can consist of a series of questions or a sample task, or can require that a minimum percentage of the Worker's previously submitted HITs were approved. Such unpaid preliminary tests can also be used by companies to obtain answers to questions without paying anything.
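The automatic-approval rule described under quality management is, in effect, plurality agreement across redundant assignments. A minimal sketch of such aggregation follows; the threshold of three agreeing Workers is an illustrative assumption, since Requesters choose the actual number.

<syntaxhighlight lang="python">
from collections import Counter

def aggregate_hit(answers, min_agreement=3):
    """Return the plurality answer for one HIT once at least
    `min_agreement` Workers agree; otherwise return None, meaning
    the HIT needs more assignments or manual review."""
    if not answers:
        return None
    answer, count = Counter(answers).most_common(1)[0]
    return answer if count >= min_agreement else None

# Five Workers labeled the same image; four agree on the answer.
print(aggregate_hit(['cat', 'cat', 'dog', 'cat', 'cat']))  # -> cat
</syntaxhighlight>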


=== Location of Turkers ===
Workers have been primarily located in the United States since the platform's inception<ref>{{cite news|url= http://behind-the-enemy-lines.blogspot.com/2008/03/mechanical-turk-demographics.html |title=Mechanical Turk: The Demographics|author=Panos Ipeirotis|date=March 19, 2008|publisher=New York University|access-date=2009-07-30|author-link=Panos Ipeirotis}}</ref> with demographics generally similar to the overall Internet population in the U.S.<ref>{{cite news|url=http://behind-the-enemy-lines.blogspot.com/2009/03/turker-demographics-vs-internet.html|title=Turker Demographics vs Internet Demographics|author=Panos Ipeirotis|date=March 16, 2009|publisher=New York University|access-date=2009-07-30}}</ref> Within the U.S. workers are fairly evenly spread across states, proportional to each state’s share of the U.S. population.<ref name=":0">{{Cite book |last=Litman |first=Leib |url=https://www.worldcat.org/oclc/1180179545 |title=Conducting online research on Amazon Mechanical Turk and beyond |date=2020 |others=Jonathan Robinson |isbn=978-1-5063-9111-3 |edition=1st |location=Los Angeles |oclc=1180179545}}</ref> {{As of|2019}}, between 15 and 30 thousand people in the U.S. complete at least one HIT each month and about 4,500 new people join MTurk each month.<ref>{{Cite journal |last1=Robinson |first1=Jonathan |last2=Rosenzweig |first2=Cheskie |last3=Moss |first3=Aaron J. |last4=Litman |first4=Leib |date=2019-12-16 |editor-last=Sudzina |editor-first=Frantisek |title=Tapped out or barely tapped? Recommendations for how to harness the vast and largely unused potential of the Mechanical Turk participant pool |journal=PLOS ONE |language=en |volume=14 |issue=12 |pages=e0226394 |doi=10.1371/journal.pone.0226394 |issn=1932-6203 |pmc=6913990 |pmid=31841534|bibcode=2019PLoSO..1426394R |doi-access=free }}</ref>

Cash payments for Indian workers were introduced in 2010, changing the platform's demographics, though workers remained primarily within the United States.<ref>{{cite news|url= http://www.behind-the-enemy-lines.com/2010/03/new-demographics-of-mechanical-turk.html|title=The New Demographics of Mechanical Turk|author=Panos Ipeirotis|date=March 9, 2010|publisher=[[New York University]]|access-date=2014-03-24}}</ref> A website showing worker demographics in May 2015 showed that 80% of workers were located in the United States, with the remaining 20% located elsewhere in the world, most of whom were in India.<ref name="MTurk-tracker 2015">{{cite web|title=MTurk Tracker|url=http://demographics.mturk-tracker.com/|website=demographics.mturk-tracker.com|access-date=1 October 2015}}</ref> In May 2019, approximately 60% were in the U.S. and 40% elsewhere (approximately 30% in India).<ref name="MTurk-tracker 2019">{{cite web|title=MTurk Tracker|url=http://demographics.mturk-tracker.com/|website=demographics.mturk-tracker.com|access-date=2 May 2019}}</ref> In early 2023, about 90% of workers were from the U.S., and about half of the remainder were from India.<ref name="MTurk-tracker 2023">{{cite web|title=MTurk Tracker|url=https://demographics.mturk-tracker.com/#/countries/all|website=demographics.mturk-tracker.com|access-date=17 April 2023}}</ref>


== Uses ==
=== Human-subject research ===
{{As of|2010|since=y}}, numerous researchers have explored the viability of Mechanical Turk to recruit subjects for social science experiments. Researchers have generally found that while samples of respondents obtained through Mechanical Turk do not perfectly match all relevant characteristics of the U.S. population, they are also not wildly misrepresentative.<ref name="mt-cc">{{cite journal | last1 = Casey | first1 = Logan | last2 = Chandler | first2 = Jesse | last3 = Levine | first3 = Adam | last4 = Proctor | first4 = Andrew| last5 = Sytolovich| first5 = Dara| year = 2017 | title = Intertemporal Differences Among MTurk Workers: Time-Based Sample Variations and Implications for Online Data Collection | journal = SAGE Open | volume = 7 | issue = 2 | pages = 215824401771277 | doi = 10.1177/2158244017712774 | doi-access = free }}</ref><ref name="mt-lf">{{cite journal | last1 = Levay | first1 = Kevin | last2 = Freese | first2 = Jeremy | last3 = Druckman |author3-link=James N. Druckman | first3 = James| year = 2016 | title = The Demographic and Political Composition of Mechanical Turk Samples | journal = SAGE Open | volume = 6| pages = 215824401663643 | doi = 10.1177/2158244016636433 | doi-access = free }}</ref> As a result, thousands of papers that rely on data collected from Mechanical Turk workers are published each year, including hundreds in top ranked academic journals.


A challenge with using MTurk for human-subject research has been maintaining data quality. A study published in 2021 found that the types of quality control approaches used by researchers (such as checking for bots, VPN users, or workers willing to submit dishonest responses) can meaningfully influence survey results. They demonstrated this via impact on three common behavioral/mental healthcare screening tools.<ref>{{Cite journal|last1=Agley|first1=Jon|last2=Xiao|first2=Yunyu|last3=Nolan|first3=Rachael|last4=Golzarri-Arroyo|first4=Lilian|date=2021|title=Quality control questions on Amazon's Mechanical Turk (MTurk): A randomized trial of impact on the USAUDIT, PHQ-9, and GAD-7|journal=Behavior Research Methods|volume=54 |issue=2 |pages=885–897 |language=en|doi=10.3758/s13428-021-01665-8|pmid=34357539|pmc=8344397|issn=1554-3528|doi-access=free}}</ref> Even though managing data quality requires work from researchers, there is a large body of research showing how to gather high quality data from MTurk.<ref>{{Cite journal |last1=Hauser |first1=David |last2=Paolacci |first2=Gabriele |last3=Chandler |first3=Jesse J. |date=2018-09-01 |title=Common Concerns with MTurk as a Participant Pool: Evidence and Solutions |url=https://osf.io/uq45c |doi=10.31234/osf.io/uq45c|s2cid=240258666 }}
* {{Cite journal |last1=Clifford |first1=Scott |last2=Jerit |first2=Jennifer |date=2016 |title=Cheating on Political Knowledge Questions in Online Surveys: An Assessment of the Problem and Solutions |journal=Public Opinion Quarterly |language=en |volume=80 |issue=4 |pages=858–887 |doi=10.1093/poq/nfw030 |issn=0033-362X|doi-access=free }}
* {{Cite journal |last1=Hauser |first1=David J. |last2=Moss |first2=Aaron J. |last3=Rosenzweig |first3=Cheskie |last4=Jaffe |first4=Shalom N. |last5=Robinson |first5=Jonathan |last6=Litman |first6=Leib |date=2022-11-03 |title=Evaluating CloudResearch's Approved Group as a solution for problematic data quality on MTurk |journal=Behavior Research Methods |volume=55 |issue=8 |pages=3953–3964 |language=en |doi=10.3758/s13428-022-01999-x |pmid=36326997 |issn=1554-3528|doi-access=free |pmc=10700412 }}</ref><ref>{{Cite journal |last=Saravanos |first=Antonios |last2=Zervoudakis |first2=Stavros |last3=Zheng |first3=Dongnanzi |last4=Stott |first4=Neil |last5=Hawryluk |first5=Bohdan |last6=Delfino |first6=Donatella |date=2021 |editor-last=Stephanidis |editor-first=Constantine |editor2-last=Soares |editor2-first=Marcelo M. |editor3-last=Rosenzweig |editor3-first=Elizabeth |editor4-last=Marcus |editor4-first=Aaron |editor5-last=Yamamoto |editor5-first=Sakae |editor6-last=Mori |editor6-first=Hirohiko |editor7-last=Rau |editor7-first=Pei-Luen Patrick |editor8-last=Meiselwitz |editor8-first=Gabriele |editor9-last=Fang |editor9-first=Xiaowen |title=The Hidden Cost of Using Amazon Mechanical Turk for Research |url=https://link.springer.com/chapter/10.1007/978-3-030-90238-4_12 |journal=HCI International 2021 - Late Breaking Papers: Design and User Experience |language=en |location=Cham |publisher=Springer International Publishing |pages=147–164 |doi=10.1007/978-3-030-90238-4_12 |isbn=978-3-030-90238-4}}</ref> The cost of using MTurk is considerably lower than many other means of conducting surveys, so many researchers continue to use it.


The general consensus among researchers is that the service works best for recruiting a diverse sample; it is less successful with studies that require more precisely defined populations or that require a representative sample of the population as a whole.<ref name="mt-cs">{{cite journal |last1=Chandler |first1=Jesse. |last2=Shapiro |first2=Danielle |year=2016 |title=Conducting Clinical Research Using Crowdsourced Convenience Samples |url=https://www.mathematica-mpr.com/our-publications-and-findings/publications/conducting-clinical-research-using-crowdsourced-convenience-samples |journal=Annual Review of Clinical Psychology |volume=12 |pages=53–81 |doi=10.1146/annurev-clinpsy-021815-093623 |pmid=26772208 |doi-access=free}}</ref> Many papers have been published on the demographics of the MTurk population.<ref name=":0" /><ref>{{Cite journal |last1=Huff |first1=Connor |last2=Tingley |first2=Dustin |date=2015-07-01 |title="Who are these people?" Evaluating the demographic characteristics and political preferences of MTurk survey respondents |journal=Research & Politics |language=en |volume=2 |issue=3 |pages=205316801560464 |doi=10.1177/2053168015604648 |s2cid=7749084 |issn=2053-1680|doi-access=free }}</ref><ref name=":1">{{Cite journal |last1=Clifford |first1=Scott |last2=Jewell |first2=Ryan M |last3=Waggoner |first3=Philip D |date=2015-10-01 |title=Are samples drawn from Mechanical Turk valid for research on political ideology? |journal=Research & Politics |language=en |volume=2 |issue=4 |pages=205316801562207 |doi=10.1177/2053168015622072 |s2cid=146591698 |issn=2053-1680|doi-access=free }}</ref> MTurk workers tend to be younger, more educated, more liberal, and slightly less wealthy than the U.S. population overall.<ref>{{Cite journal |last1=Chandler |first1=Jesse |last2=Rosenzweig |first2=Cheskie |last3=Moss |first3=Aaron J. |last4=Robinson |first4=Jonathan |last5=Litman |first5=Leib |date=October 2019 |title=Online panels in social science research: Expanding sampling methods beyond Mechanical Turk |journal=Behavior Research Methods |language=en |volume=51 |issue=5 |pages=2022–2038 |doi=10.3758/s13428-019-01273-7 |issn=1554-3528 |pmc=6797699 |pmid=31512174}}</ref>


=== Machine Learning ===
[[Supervised Machine Learning]] algorithms require large amounts of human-annotated data to be trained successfully. Machine learning researchers have hired Workers through Mechanical Turk to produce datasets such as SQuAD, a [[question answering]] dataset.<ref>{{cite arXiv |eprint=1606.05250|title= SQuAD: 100,000+ Questions for Machine Comprehension of Text|class= cs.CL|last1= Rajpurkar|first1= Pranav|last2= Zhang|first2= Jian|last3= Lopyrev|first3= Konstantin|last4= Liang|first4= Percy|year= 2016}}</ref>


=== Missing persons searches ===
{{As of|2007|since=y}}, the service has been used to search for prominent missing individuals. This use was first suggested during the search for [[James Kim]], but his body was found before any technical progress was made. That summer, computer scientist [[Jim Gray (computer scientist)|Jim Gray]] disappeared on his yacht and Amazon's [[Werner Vogels]], a personal friend, made arrangements for [[DigitalGlobe]], which provides satellite data for [[Google Maps]] and [[Google Earth]], to put recent photography of the [[Farallon Islands]] on Mechanical Turk. A front-page story on [[Digg]] attracted 12,000 searchers who worked with imaging professionals on the same data. The search was unsuccessful.<ref>{{cite magazine|url=https://www.wired.com/techbiz/people/magazine/15-08/ff_jimgray?currentPage=5|title=Inside the High-Tech Search for a Silicon Valley Legend|author=Steve Silberman|date=July 24, 2007|magazine=Wired magazine|access-date=2007-09-16}}</ref>


In September 2007, a similar arrangement was repeated in the [[Steve Fossett search|search for aviator Steve Fossett]]. Satellite data was divided into {{convert|85|m2|ft2|adj=on}} sections, and Mechanical Turk users were asked to flag images with "foreign objects" that might be a crash site or other evidence that should be examined more closely.<ref>{{cite web|url=http://www.avweb.com/avwebflash/news/SteveFossettSearch_AmazonMechanicalTurk_PleaseHelp_196097-1.html |title=AVweb Invites You to Join the Search for Steve Fossett |date=8 September 2007 |publisher=Avweb.com |access-date=2011-11-28}}</ref> This search was also unsuccessful. The satellite imagery was mostly within a 50-mile radius,<ref>{{cite web|url=http://s3.amazonaws.com/fossett/index.html|title=Official Mechanical Turk Steve Fossett Results|date=2007-09-24|access-date=2012-08-14}}</ref> but the crash site was eventually found by hikers about a year later, 65 miles away.<ref>{{cite news|url=https://www.reuters.com/article/peopleNews/idUSTRE4907G820081001|title=Hikers find Steve Fossett's ID, belongings|author=Jim Christie|date=October 1, 2008|work=Reuters|access-date=2008-11-27| archive-url= https://web.archive.org/web/20081220030716/https://www.reuters.com/article/peopleNews/idUSTRE4907G820081001| archive-date= 20 December 2008| url-status= live}}</ref>


=== Artistic works ===
MTurk has also been used as a tool for artistic creation. One of the first artists to work with Mechanical Turk was [[xtine burrough]], with ''The Mechanical Olympics'' (2008),<ref name="Rhizome 2008">{{Cite web | url=http://rhizome.org/editorial/2008/aug/5/lets-get-physical/ |title = Let's Get Physical| date=5 August 2008 }}</ref><ref>{{Cite web | url=http://neural.it/2010/10/mechanical-games-online-sports-video-for-turkers/ |title = Mechanical Games, online sports video for turkers &#124; Neural| date=29 October 2010 }}</ref> ''Endless Om'' (2015), and ''Mediations on Digital Labor'' (2015).<ref>{{Cite web |url=http://www.ocweekly.com/2015-05-28/culture/john-spiak-grand-central-art-center-santa-ana/ |title=Jail Benches and Amazon.com at SanTana's Grand Central Art Center &#124; OC Weekly |access-date=2019-04-16 |archive-url=https://web.archive.org/web/20150906074809/http://www.ocweekly.com/2015-05-28/culture/john-spiak-grand-central-art-center-santa-ana/ |archive-date=2015-09-06 |url-status=dead }}
* Project: http://www.missconceptions.net/mediations/</ref> Another work was artist [[Aaron Koblin]]'s ''Ten Thousand Cents'' (2008), which used MTurk workers to produce thousands of individual drawings, each a small part of a US$100 bill, and combined them into a single image.<ref>Ten Thousand Cents – Project: http://www.tenthousandcents.com/top.html</ref><ref>Koblin, Aaron: http://www.aaronkoblin.com/work.html</ref>


=== Third-party programming ===
Programmers have developed browser extensions and [[Scripting language|scripts]] designed to simplify the process of completing jobs. Amazon has stated that it disapproves of scripts that completely automate the process and preclude the human element, out of concern that the task completion process (e.g., answering a survey) could be gamed with random responses, rendering the collected data worthless.<ref>{{cite web|url=http://aws.typepad.com/aws/2005/12/amazon_mechanic.html |title=Amazon Web Services Blog: Amazon Mechanical Turk Status Update |publisher=Aws.typepad.com |date=2005-12-06 |access-date=2011-11-28}}</ref> Accounts using such automated bots have been banned. {{clarify|text=There are also third-party services that extend MTurk's capabilities.|date=April 2023}}


==== API ====
Amazon makes available an [[application programming interface]] (API) for the MTurk system. The MTurk API lets a programmer submit jobs, retrieve completed work, and approve or reject that work.<ref>{{cite web |url=http://developer.amazonwebservices.com/connect/kbcategory.jspa?categoryID=28 |title=Documentation Archive : Amazon Web Services |publisher=Developer.amazonwebservices.com |access-date=2011-11-28 |url-status=dead |archive-url=https://web.archive.org/web/20090410032147/http://developer.amazonwebservices.com/connect/kbcategory.jspa?categoryID=28 |archive-date=2009-04-10 }}</ref> In 2017, Amazon launched support for AWS Software Development Kits (SDKs), making nine SDKs available to MTurk users.{{importance inline|date=April 2023}} MTurk is accessible via API from the following languages: Python, JavaScript, Java, .NET, Go, Ruby, PHP, and C++.<ref>{{cite web|url=http://docs.aws.amazon.com/AWSMechTurk/latest/AWSMturkAPI/Welcome.html | title=Amazon Mechanical Turk API Reference |publisher=Developer.amazonwebservices.com }}</ref> Web sites and web services can use the API to integrate MTurk work into other web applications, providing users with alternatives to the interface Amazon has built for these functions.
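As an illustration of the submit/retrieve/approve cycle, the following minimal sketch uses the boto3 Python SDK; the HIT ID and feedback string are placeholders, and AWS credentials are assumed to be configured in the environment.

<syntaxhighlight lang="python">
import boto3

# Sketch only: fetch the submitted assignments for an existing HIT
# and approve them, which releases payment to the Workers.
mturk = boto3.client('mturk', region_name='us-east-1')

response = mturk.list_assignments_for_hit(
    HITId='EXAMPLE_HIT_ID',            # placeholder: ID returned by create_hit
    AssignmentStatuses=['Submitted'],  # only work awaiting review
)

for assignment in response['Assignments']:
    # Each Worker's answer arrives as a QuestionFormAnswers XML document.
    print(assignment['WorkerId'], assignment['Answer'])
    # approve_assignment() pays the Worker; reject_assignment() would
    # decline the work instead, which affects the Worker's reputation.
    mturk.approve_assignment(
        AssignmentId=assignment['AssignmentId'],
        RequesterFeedback='Thank you for your work.',
    )
</syntaxhighlight>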


=== Use case examples ===


==== Processing photos / videos ====
Amazon Mechanical Turk provides a platform for processing images, a task well-suited to human intelligence. Requesters have created tasks that ask workers to label objects found in an image, select the most relevant picture in a group of pictures, screen inappropriate content, classify objects in satellite images, or digitize text from images such as scanned forms filled out by hand.<ref name="tr-hr">{{Cite web|url=https://www.techrepublic.com/article/inside-amazons-clickworker-platform-how-half-a-million-people-are-training-ai-for-pennies-per-task/|title=Inside Amazon's clickworker platform: How half a million people are being paid pennies to train AI|website=TechRepublic|date=16 December 2016 }}</ref>


==== Data cleaning / verification ====
Companies with large online catalogs use Mechanical Turk to identify duplicates and verify details of item entries. For example: removing duplicates in yellow pages directory listings, checking restaurant details (e.g. phone number and hours), and finding contact information from web pages (e.g. author name and email).<ref name="utne" /><ref name="tr-hr" />


==== Information collection ====
The diversity and scale of the Mechanical Turk workforce allow information to be collected at a scale that would be difficult outside of a crowd platform. Mechanical Turk allows Requesters to amass a large number of responses to various types of surveys, from basic demographics to academic research. Other uses include writing comments, descriptions, and blog entries for websites and searching data elements or specific fields in large government and legal documents.<ref name="tr-hr" />


==== Data processing ====


== Research validity ==
The validity of research conducted with the Mechanical Turk worker pool has long been debated among experts.<ref>{{cite journal|title=Can I Use Mechanical Turk (MTurk) for a Research Study?|journal=Industrial and Organizational Psychology|volume=8|issue=2|url=http://neoacademic.com/2014/11/13/can-use-mechanical-turk-mturk-research-study/|year=2015|last1=Landers|first1=R. N.|last2=Behrend|first2=T. S.}}</ref> This is largely because questions of validity<ref>{{Cite web |title=External Validity - Generalizing Results in Research |url=https://explorable.com/external-validity |website=explorable.com}}
* {{Cite web |title=Social Research Methods - Knowledge Base - External Validity |url=https://www.socialresearchmethods.net/kb/external.php |website=www.socialresearchmethods.net}}</ref> are complex: they involve not only questions of whether the research methods were appropriate and whether the study was well-executed, but also questions about the goal of the project, how the researchers used MTurk, who was sampled, and what conclusions were drawn.

Most experts agree that MTurk is better suited for some types of research than others. MTurk appears well-suited for questions that seek to understand whether two or more things are related to each other (called correlational research; e.g., are happy people healthier?) and questions that attempt to show one thing causes another thing (experimental research; e.g., being happy makes people healthier). These categories capture most of the research conducted by behavioral scientists, and most correlational and experimental findings found in nationally representative samples replicate on MTurk.<ref>{{Cite journal |last1=Coppock |first1=Alexander |last2=Leeper |first2=Thomas J. |last3=Mullinix |first3=Kevin J. |date=2018-12-04 |title=Generalizability of heterogeneous treatment effect estimates across samples |journal=Proceedings of the National Academy of Sciences |language=en |volume=115 |issue=49 |pages=12441–12446 |doi=10.1073/pnas.1808083115 |issn=0027-8424 |pmc=6298071 |pmid=30446611|bibcode=2018PNAS..11512441C |doi-access=free }}</ref>

The type of research that is not well-suited for MTurk is often called "descriptive research." Descriptive research seeks to describe how or what people think, feel, or do; one example is public opinion polling. MTurk is not well-suited to such research because it does not select a representative sample of the general population. Instead, MTurk is a nonprobability,{{jargon inline|date=April 2023}} convenience sample. Descriptive research is best conducted with a probability-based, representative sample of the population researchers want to understand. When compared to the general population, people on MTurk are younger, more highly educated, more liberal, and less religious.<ref>{{Cite journal |last1=Chandler |first1=Jesse |last2=Rosenzweig |first2=Cheskie |last3=Moss |first3=Aaron J. |last4=Robinson |first4=Jonathan |last5=Litman |first5=Leib |date=2019-10-01 |title=Online panels in social science research: Expanding sampling methods beyond Mechanical Turk |url=https://doi.org/10.3758/s13428-019-01273-7 |journal=Behavior Research Methods |language=en |volume=51 |issue=5 |pages=2022–2038 |doi=10.3758/s13428-019-01273-7 |issn=1554-3528 |pmc=6797699 |pmid=31512174}}</ref><ref name=":0" /><ref name=":1" />


== Labor issues ==
{{npov section|date=July 2023}}
Mechanical Turk has been criticized by journalists and activists for its treatment and use of labor. Computer scientist [[Jaron Lanier]] noted how the [[Web design|design]] of Mechanical Turk "allows you to think of the people as software components" in a way that conjures "a sense of magic, as if you can just pluck results out of the cloud at an incredibly low cost".<ref name="Lanier2014">{{cite book|author=Jaron Lanier|title=Who Owns the Future? |year=2013|publisher=Simon and Schuster|isbn=978-1-4516-5497-4|title-link=Who Owns the Future? }}</ref> A similar point is made in the book ''Ghost Work'' by Mary L. Gray and Siddharth Suri.<ref>{{Cite web |title=Ghost Work |url=https://ghostwork.info/ |access-date=2023-01-24 |website=Ghost Work |language=en-US}}</ref>{{importance inline|date=April 2023}}

Critics of MTurk argue that workers are forced onto the site by precarious economic conditions and then exploited by requesters with low wages and a lack of power when disputes occur. Journalist Alana Semuels’s article "The Internet Is Enabling a New Kind of Poorly Paid Hell" in ''The Atlantic'' is typical of such criticisms of MTurk.<ref>{{Cite web |last=Semuels |first=Alana |date=2018-01-23 |title=The Internet Is Enabling a New Kind of Poorly Paid Hell |url=https://www.theatlantic.com/business/archive/2018/01/amazon-mechanical-turk/551192/ |access-date=2023-01-24 |website=The Atlantic |language=en}}</ref>

Some{{who|date=July 2023}} academic papers report findings that support these criticisms,<ref>{{Cite journal |last1=Fort |first1=K. |last2=Adda |first2=G. |last3=Cohen |first3=K.B. |date=2011 |title=Amazon Mechanical Turk: Gold mine or coal mine? |journal=Computational Linguistics |volume=37 |issue=2 |pages=413–420|doi=10.1162/COLI_a_00057 |s2cid=1051130 |doi-access=free }}
* {{Cite journal |last=Williamson |first=Vanessa |date=January 2016 |title=On the Ethics of Crowdsourced Research |journal=PS: Political Science & Politics |language=en |volume=49 |issue=1 |pages=77–81 |doi=10.1017/S104909651500116X |doi-broken-date=1 November 2024 |s2cid=155196102 |issn=1049-0965|doi-access=free }}</ref> but others contradict them.<ref>{{Cite journal |last=Horton |first=John J. |date=2011-04-01 |title=The condition of the Turking class: Are online employers fair and honest? |url=https://www.sciencedirect.com/science/article/pii/S0165176510004398 |journal=Economics Letters |language=en |volume=111 |issue=1 |pages=10–12 |doi=10.1016/j.econlet.2010.12.007 |arxiv=1001.1172 |s2cid=37577313 |issn=0165-1765}}
* {{Cite journal |last1=Moss |first1=A.J. |last2=Rosenzweig |first2=C. |last3=Robinson |first3=J. |last4=Jaffe |first4=S. |last5=Litman |first5=L. |date=2020 |title=Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages |url=https://psyarxiv.com/jbc9d/ |journal=Behavior Research Methods}}</ref> A 2024 academic commentary argued that, as a matter of ethics, study participants on sites like MTurk should be clearly warned about the circumstances under which they might later be denied payment,<ref>{{Cite journal |last1=Agley |first1=Jon |last2=Mumaw |first2=Casey |date=2024-05-29 |title=Warning Crowdsourced Study Participants About Possible Consequences for Inattentive Participation Relates to Informed Consent, Regardless of Effects on Data Quality |url=https://newprairiepress.org/hbr/vol7/iss2/5 |journal=Health Behavior Research |language=en |volume=7 |issue=2 |doi=10.4148/2572-1836.1236 |issn=2572-1836}}</ref> even though such warnings may not reduce the rate of careless responding.<ref>{{Cite journal |last1=Brühlmann |first1=Florian |last2=Memeti |first2=Zgjim |last3=Aeschbach |first3=Lena F. |last4=Perrig |first4=Sebastian A. C. |last5=Opwis |first5=Klaus |date=2024-01-18 |title=The effectiveness of warning statements in reducing careless responding in crowdsourced online surveys |journal=Behavior Research Methods |volume=56 |issue=6 |pages=5862–5875 |language=en |doi=10.3758/s13428-023-02321-z |issn=1554-3528|doi-access=free |pmid=38238528 |pmc=11335820 }}</ref>

A paper published by a team at CloudResearch<ref name=":3" /> found that only about 7% of people on MTurk view completing HITs as something akin to a full-time job. Most people report that MTurk is a way to earn money during their leisure time or as a side gig. In 2019, the typical worker spent five to eight hours per week on the site and earned around $7 per hour. The sampled workers did not report widespread mistreatment at the hands of requesters; they reported trusting requesters more than employers outside of MTurk. Similar findings were presented in a review of MTurk by the Fair Crowd Work organization, a collective of crowd workers and unions.<ref>{{Cite web |title=Amazon Mechanical Turk -Fair Crowd Work |url=http://faircrowd.work/platform/amazon-mechanical-turk/ |access-date=2023-01-24 |language=en}}</ref>{{unreliable source|date=July 2023}}


=== Monetary compensation ===
The minimum payment that Amazon allows for a task is one cent. Because tasks are typically simple and repetitive, the majority of tasks pay only a few cents,<ref>Ipeirotis, P. G. (2010). Analyzing the Amazon Mechanical Turk marketplace. ''XRDS: Crossroads, The ACM Magazine for Students'', ''17''(2), 16–21.
* {{Cite web |last=Geiger |first=Abigail |date=2016-07-11 |title=Research in the Crowdsourcing Age, a Case Study |url=https://www.pewresearch.org/internet/2016/07/11/research-in-the-crowdsourcing-age-a-case-study/ |access-date=2023-01-24 |website=Pew Research Center: Internet, Science & Tech |language=en-US}}</ref> but there are also well-paying tasks on the site.

Many criticisms of MTurk stem from the fact that the majority of tasks offer low wages. In addition, workers are considered [[Independent contracting in the United States|independent contractors]] rather than employees. Independent contractors are not protected by the [[Fair Labor Standards Act of 1938|Fair Labor Standards Act]] or other legislation that protects workers' rights.{{globalize inline|United States|date=April 2023}} Workers on MTurk must also compete with one another for desirable HITs and spend time on uncompensated activities, such as searching for tasks.
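The cost of that uncompensated time is simple arithmetic. In the hypothetical sketch below, the reward and the paid and unpaid durations are invented for illustration; they are not platform statistics.

<syntaxhighlight lang="python">
# Hypothetical effective-wage arithmetic: unpaid search time lowers
# hourly earnings. All figures are invented for illustration.
reward = 0.10           # payment per HIT, in dollars (assumed)
work_seconds = 45       # time spent completing the HIT (assumed)
search_seconds = 15     # unpaid time spent finding the HIT (assumed)

nominal = reward / work_seconds * 3600
effective = reward / (work_seconds + search_seconds) * 3600
print(f"nominal wage:   ${nominal:.2f}/hour")    # $8.00/hour
print(f"effective wage: ${effective:.2f}/hour")  # $6.00/hour
</syntaxhighlight>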

The low payment offered for many tasks has fueled criticism of Mechanical Turk for exploiting and not compensating workers for the true value of the tasks they complete.<ref name="ieeexplore">{{cite book |doi=10.1109/CGC.2013.89|pages=531–535|year=2013|last1=Schmidt|first1=Florian Alexander|chapter=The Good, the Bad and the Ugly: Why Crowdsourcing Needs Ethics |title=2013 International Conference on Cloud and Green Computing|isbn=978-0-7695-5114-2|s2cid=18798641}}</ref> One study of 3.8 million tasks completed by 2,767 workers showed that "workers earned a median hourly wage of about $2 an hour", with 4% of workers earning more than $7.25 per hour.<ref>{{Cite book |last1=Hara |first1=Kotaro |last2=Adams |first2=Abigail |last3=Milland |first3=Kristy |last4=Savage |first4=Saiph |last5=Callison-Burch |first5=Chris |last6=Bigham |first6=Jeffrey P. |title=Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems |chapter=A Data-Driven Analysis of Workers' Earnings on Amazon Mechanical Turk |date=2018-04-21 |chapter-url=https://doi.org/10.1145/3173574.3174023 |series=CHI '18 |location=New York, NY, USA |publisher=Association for Computing Machinery |pages=1–14 |doi=10.1145/3173574.3174023 |isbn=978-1-4503-5620-6|s2cid=5040507 |url=https://ora.ox.ac.uk/objects/uuid:9a415bec-5a69-447d-8c23-6b2aeef4de07 }}</ref>

The Pew Research Center and the International Labour Office published data indicating that workers earned around $5.00 per hour in 2015.<ref name=":2" /><ref>Berg, J. (2015). Income security in the on-demand economy: Findings and policy lessons from a survey of crowdworkers. ''Comparative Labor Law and Policy Journal, 37,'' 543.</ref> A study focused on workers in the U.S. indicated average wages of at least $5.70 an hour,<ref>{{Cite journal |last1=Litman |first1=Leib |last2=Robinson |first2=Jonathan |last3=Rosen |first3=Zohn |last4=Rosenzweig |first4=Cheskie |last5=Waxman |first5=Joshua |last6=Bates |first6=Lisa M. |date=2020-02-21 |title=The persistence of pay inequality: The gender pay gap in an anonymous online labor market |journal=PLOS ONE |language=en |volume=15 |issue=2 |pages=e0229383 |doi=10.1371/journal.pone.0229383 |issn=1932-6203 |pmc=7034870 |pmid=32084233|bibcode=2020PLoSO..1529383L |doi-access=free }}</ref> and data from the CloudResearch study found average wages of about $6.61 per hour.<ref name=":3" /> Some evidence suggests that very active and experienced workers can earn $20 per hour or more.<ref>{{Cite web |date=2019-11-18 |title=MTurk is the most ethical way to recruit crowd workers. |url=https://blog.turkerview.com/writer-who-never-learned-to-drive-works-for-uber/ |access-date=2023-01-24 |website=Blog {{!}} TurkerView |language=en}}</ref>


=== Fraud ===
''[[The Nation]]'' magazine reported in 2014 that some Requesters had taken advantage of Workers by having them do the tasks, then rejecting their submissions in order to avoid paying them.<ref>{{Cite magazine|url=https://www.thenation.com/article/how-crowdworkers-became-ghosts-digital-machine/|title=How Crowdworkers Became the Ghosts in the Digital Machine|first=Moshe Z.|last=Marvit|date=February 5, 2014|magazine=www.thenation.com}}</ref> Available data indicates that rejections are fairly rare. Workers report having a small minority of their HITs rejected, perhaps as low as 1%.<ref name=":3" />

In the [[Facebook–Cambridge Analytica data scandal]], Mechanical Turk was one of the means of covertly gathering private information for a massive database.<ref>{{cite news|url=https://www.nytimes.com/2018/04/10/magazine/cambridge-analytica-and-the-coming-data-bust.html?|author=New York Times|title=Cambridge Analytica and the Coming Data Bust|date=April 10, 2018|access-date=April 13, 2018|newspaper=The New York Times}}</ref> The system paid people a dollar or two to install a [[Facebook]]-connected app and answer personal questions. Although the survey task was presented as demographic or psychological research, its actual purpose was to bait workers into revealing personal information about their identities that Facebook and Mechanical Turk had not already collected.


=== Labor relations ===
Critics have also noted that the marketplace does not allow workers to negotiate with employers. In response to criticisms of payment evasion and lack of representation, a group developed a third-party platform called Turkopticon, which allows workers to give feedback on their employers, helping workers avoid potentially unscrupulous jobs and recommend superior employers.<ref name="newscientist">{{cite magazine |url= https://www.newscientist.com/article/mg21729036.200-crowdsourcing-grows-up-as-online-workers-unite.html#.VV9rTE9Viko |title=Crowdsourcing grows up as online workers unite |author=Hal Hodson |date=February 7, 2013 |magazine=New Scientist |access-date=May 21, 2015}}</ref><ref>{{Cite web|url=https://turkopticon.ucsd.edu/|title=turkopticon.|website=turkopticon.ucsd.edu}}</ref> Another platform called Dynamo allows workers to gather anonymously and organize campaigns to better their work environment, such as the Guidelines for Academic Requesters and the Dear Jeff Bezos Campaign.<ref name="theguardian">{{cite news |url= https://www.theguardian.com/technology/2014/dec/03/amazon-mechanical-turk-workers-protest-jeff-bezos |title='Amazon's Mechanical Turk workers protest: 'I am a human being, not an algorithm'{{'}} |author=Mark Harris |date=December 3, 2014 |newspaper=The Guardian |access-date=October 6, 2015}}</ref><ref name="engadget">{{cite magazine |url= https://www.engadget.com/2014/12/03/amazon-mechanical-turk-workers-ask-for-respect/ |title='Amazon's Mechanical Turk workers want to be treated like humans' |first=Jon |last=Fingas |date=December 3, 2014 |magazine=Engadget |access-date=October 6, 2015}}</ref><ref name="theverge">{{cite web |url= https://www.theverge.com/2014/12/4/7331777/amazon-mechanical-turk-workforce-digital-labor |title=Amazon's Mechanical Turkers want to be recognized as 'actual human beings' |author=James Vincent |date=December 4, 2014 |website=The Verge |access-date=October 6, 2015}}</ref><ref name="fastcompany">{{cite magazine |url= http://www.fastcompany.com/3042081/what-does-a-union-look-like-in-the-gig-economy |title=What Does a Union Look Like in the Gig Economy? |author=Sarah Kessler |date=February 19, 2015 |magazine=Fast Company |access-date=October 6, 2015}}</ref> Amazon made it harder for workers to enroll in Dynamo by closing the requester account that provided workers with a code required for Dynamo membership. Workers created third-party plugins to identify higher-paying tasks, but Amazon updated its website to prevent these plugins from working.<ref name="Atlantic Semuels 2018" /> Workers have complained that Amazon's payment system will on occasion stop working.<ref name="Atlantic Semuels 2018">{{cite web |last1=Semuels |first1=Alana |title=The Internet Is Enabling a New Kind of Poorly Paid Hell |url=https://www.theatlantic.com/business/archive/2018/01/amazon-mechanical-turk/551192/ |website=The Atlantic |access-date=25 April 2019 |date=23 January 2018}}</ref>


== Related systems ==
{{further|Crowdsourcing}}


Mechanical Turk is comparable in some respects to the now-discontinued [[Google Answers]] service. However, Mechanical Turk is a more general [[Marketplace#Internet Markets|marketplace]] that can potentially distribute any kind of work task all over the world. The [[Collaborative human interpreter|Collaborative Human Interpreter]] (CHI) by Philipp Lenssen also proposed using distributed human intelligence to help computer programs perform tasks that computers cannot do well. MTurk could serve as the execution engine for the CHI.{{citation needed|date=April 2019}}
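Using MTurk as such an execution engine means posting tasks and collecting answers programmatically through its API. The sketch below, written against the AWS SDK for Python (boto3), is a minimal illustration rather than a complete application: the question text, reward, and other parameters are placeholders, and a registered requester account with AWS credentials is assumed. It targets the sandbox endpoint, where no real payments are made.

<syntaxhighlight lang="python">
# Minimal sketch of MTurk as an "execution engine": post a HIT, then
# fetch and approve the submitted work. All parameters are placeholders.
import boto3

# Sandbox endpoint: HITs are visible to test workers and cost nothing.
mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# A one-question form in MTurk's QuestionForm XML schema.
question_xml = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>answer</QuestionIdentifier>
    <QuestionContent><Text>Describe the weather where you are in one sentence.</Text></QuestionContent>
    <AnswerSpecification><FreeTextAnswer/></AnswerSpecification>
  </Question>
</QuestionForm>"""

hit = mturk.create_hit(
    Title="Answer one short question",         # placeholder title
    Description="A single free-text question", # placeholder description
    Reward="0.05",                             # dollars, passed as a string
    MaxAssignments=3,                          # workers recruited per HIT
    LifetimeInSeconds=3600,                    # how long the HIT stays listed
    AssignmentDurationInSeconds=300,           # time allowed per worker
    Question=question_xml,
)

# Later: retrieve submitted assignments and approve them so workers are paid.
submitted = mturk.list_assignments_for_hit(
    HITId=hit["HIT"]["HITId"], AssignmentStatuses=["Submitted"]
)
for assignment in submitted["Assignments"]:
    mturk.approve_assignment(AssignmentId=assignment["AssignmentId"])
</syntaxhighlight>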

In 2014, the Russian search company [[Yandex]] launched [[Toloka]], a crowdsourcing platform similar to Mechanical Turk.<ref>{{Cite web | url=https://toloka.yandex.com/ |title = Yandex.Toloka}}</ref>


== See also ==
* [[CAPTCHA]], which challenges and verifies human work at a simple online task
* [[Citizen science]]
* [[Microwork]]


== References ==
{{reflist}}


== Further reading ==
* [https://web.archive.org/web/20051107040557/http://www.businessweek.com/the_thread/techbeat/archives/2005/11/amazons_mechani.html Business Week article on Mechanical Turk] by Rob Hof, November 4, 2005.
* [https://www.wired.com/wired/archive/14.06/crowds.html Wired Magazine] story about "Crowdsourcing," June 2006.
* [http://www.salon.com/tech/feature/2006/07/24/turks/index1.html Salon.com article on Mechanical Turk] by Katharine Mieszkowski, July 24, 2006.


== External links ==
* {{Official website}}
* [http://mturkpublic.s3.amazonaws.com/docs/MTURK_BP.pdf Requester Best Practices Guide], Updated February 2015.
* {{cite web |url=http://ir.ischool.utexas.edu/crowd/#mturk |title=Amazon Mechanical Turk |work=Crowdsourcing News, Events, and Resources |editor=Matt Lease |via= [[University of Texas at Austin School of Information]] |location=US}}


{{Cloud computing}}
{{Amazon}}


[[Category:Amazon (company)|Mechanical Turk]]
[[Category:Internet properties established in 2005]]
[[Category:Crowdsourcing]]
[[Category:Human-based computation]]
