Research Article

CROWDSOURCING CS AS A DATA COLLECTION PLATFORM: A LITERATURE REVIEW AND ASSESSMENT BASED ON THE CASE OF MECHANICAL TURK

Year 2017, ICMEB17 Özel Sayısı, 975 - 982, 01.12.2017

Abstract

Positivist academic studies are generally based on the testing of theories and hypotheses, and this testing process requires participants with known characteristics. To this end, students, recruited through convenience sampling, were historically the most frequently used participants in academic research, but in recent years the Internet has become popular both as a source of participants and as a data collection method. In this context, the concept of crowdsourcing (CS), coined by Jeff Howe in 2006, has been gaining popularity. Alongside this growing popularity, a number of sampling, statistical, and other scientific problems have become the subject of debate in academic circles. In simple terms, crowdsourcing may be defined as the collection of the data or information sources required for a particular study from a large number of people, i.e., a crowd. In the context of the social sciences, crowdsourcing may be described as collecting the needed data from Internet data collection platforms with very large pools of participants or workers, such as Amazon Mechanical Turk, Survey Monkey, TryMyUI, Facebook, Qualtrics, etc. However, data collected through these platforms may pose many problems, including sampling issues, reliability and validity concerns, representativeness, statistical analysis, and the generalizability of findings. The purpose of this study is, through a critical literature review, to examine and discuss the pros and cons of data collected through the CS approach, the appropriateness of such data for academic research purposes, and its implications for the outcomes of scientific work. The discussion and assessment are based on Amazon's Mechanical Turk, the data collection platform most frequently discussed in the literature over the last several years.
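The reliability and validity problems noted above are often mitigated in practice by screening crowdsourced responses before analysis. The following sketch is purely illustrative — the field names, thresholds, and checks are assumptions for demonstration, not procedures described in the article:

```python
# Hypothetical screening of crowdsourced survey responses: drop duplicate
# worker IDs, failed attention checks, and implausibly fast completions.

def screen_responses(responses, min_seconds=60):
    seen = set()
    kept = []
    for r in responses:
        if r["worker_id"] in seen:           # duplicate submission
            continue
        if not r["passed_attention_check"]:  # careless responding
            continue
        if r["duration_sec"] < min_seconds:  # speeding through the survey
            continue
        seen.add(r["worker_id"])
        kept.append(r)
    return kept

sample = [
    {"worker_id": "A1", "passed_attention_check": True,  "duration_sec": 300},
    {"worker_id": "A1", "passed_attention_check": True,  "duration_sec": 280},  # duplicate
    {"worker_id": "B2", "passed_attention_check": False, "duration_sec": 250},  # failed check
    {"worker_id": "C3", "passed_attention_check": True,  "duration_sec": 20},   # too fast
    {"worker_id": "D4", "passed_attention_check": True,  "duration_sec": 180},
]
print([r["worker_id"] for r in screen_responses(sample)])  # ['A1', 'D4']
```

Such filters reduce careless-responding noise, but, as the literature reviewed here stresses, they do not resolve the deeper issues of representativeness and generalizability.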

References

  • Adda, G., & Cohen, K.B. (2017). Amazon Mechanical Turk: Gold mine or coal mine?. Computational Linguistics, 37(2), 2-10.
  • Bakic, T.V., Kostic, S.C., & Neskovic, E. (2014). Applications of crowdsourcing in marketing. Revista Romana de Statistica – Supplement, 7, 7-16.
  • Bergman, M.E., & Jean, V.A. (2016). Where have all the “workers” gone? A critical analysis of the unrepresentativeness of our samples relative to the labor market in the industrial–organizational psychology literature. Industrial and Organizational Psychology: Perspectives on Science and Practice, 9, 84–113.
  • Brabham, D.C. (2008). Crowdsourcing as a model for problem solving: An introduction and cases. Convergence: The International Journal of Research into New Media Technologies, 14(1), 75-90.
  • Buhrmester, M., Kwang, T., & Gosling, S.D. (2011). Amazon’s Mechanical Turk: A new source of inexpensive, yet high-quality, data?. Perspectives on Psychological Science, 6(1), 3-9.
  • Chandler, J., Mueller, P., & Paolacci, G. (2014). Nonnaïveté among Amazon mechanical Turk workers: Consequences and solutions for behavioral researchers. Behavior Research Methods, 46, 112–130.
  • Christenson, D.P., & Glick, D.M. (2013). Crowdsourcing panel studies and real time experiments in MTurk. The Political Methodologist, 20, 27-33.
  • Doan, A., Ramakrishnan, R., & Halevy, A.Y. (2011). Crowdsourcing systems on the world-wide web. Communications of the ACM, 54(4), 86-96.
  • Edgar, J., Murphy, J., & Keating, M. (2016). Comparing traditional and crowdsourcing methods for pretesting survey questions. SAGE Open, October-December, 1-16.
  • Estelles-Arolas, E. & Ladron-De-Guevara, F. (2012). Towards an integrated crowdsourcing definition. Journal of Information Science, 38(2), 189–200.
  • Goodman, J.K. & Paolacci, G. (2017). Crowdsourcing consumer research. Journal of Consumer Research, 44, February, 196-208.
  • Goodrich, R. (2013). What is crowdsourcing?. Business News Daily, February 26, 1.
  • Google (2017). Crowdsourcing. Accessed: 02.12.2017.
  • Harms, P. D., & DeSimone, J. A. (2015). Caution MTurk workers ahead—Fines doubled. Industrial and Organizational Psychology: Perspectives on Science and Practice, 8(2), 183–190.
  • Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The weirdest people in the world?. Behavioral and Brain Sciences, 33(2/3), 61–83.
  • Hitlin, P. (2016). Research in the crowdsourcing age: A case study. Pew Research Center, July, 1-36.
  • Hook, L. (2016). The humans behind mechanical Turk’s artificial intelligence. Financial Times, October 26, 2016.
  • Hossain, M., & Kauranen, I. (2015). Crowdsourcing: A comprehensive literature review. Strategic Outsourcing: An International Journal, 8(1), 2-22. www.emeraldinsight.com/1753-8297.htm
  • Howe, J. (2006). The rise of crowdsourcing. Wired Magazine, 14(6). http://www.wired.com/wired/archive/14.06/crowds_pr.html
  • Huff, C., & Tingley, D. (2015). Who are these people? Evaluating the demographic characteristics and political preferences of MTurk survey respondents. Research and Politics, July-September, 1-12.
  • Keith, M., & Harms, P.D. (2016). Is Mechanical Turk the answer to our sampling woes?. Industrial and Organizational Psychology, 9(1), 162-167.
  • Paolacci, G., & Chandler, J. (2014). Inside the Turk: Understanding Mechanical Turk as a participant pool. Current Directions in Psychological Science, 23(3), 184-188.
  • Ross, J., Irani, L., Silberman, M.S., Zaldivar, A., & Tomlinson, B. (2010). Who are the crowdworkers? Shifting demographics in Amazon Mechanical Turk. Proceedings of the 28th International Conference on Human Factors in Computing Systems. https://www.researchgate.net/publication/221519699_Who_Are_the_Crowdworkers_Shifting_Demographics_in_Mechanical_Turk
  • Roulin, N. (2017). Don’t throw the baby out with the bathwater: Comparing data quality of crowdsourcing, online panels, and student samples. Industrial and Organizational Psychology: Perspectives on Science and Practice, 9(1), 191-196. https://doi.org/10.1017/iop.2015.24
  • Saxton, G.D., Oh, O., & Kisher, R. (2013). Rules of crowdsourcing: Models, issues and systems of control. Information Systems Management, 30, 2-20.
  • Vakharia, D., & Lease, M. (2015). Beyond Mechanical Turk: An analysis of paid crowd work platforms. iConference 2015 Proceedings, 1-17.
  • Whitla, P. (2009). Crowdsourcing and its applications in marketing activities. Contemporary Management Research, 5(1), 15-28.

BİR VERİ TOPLAMA PLATFORMU OLARAK CROWDSOURCING CS : MECHANICAL TURK ÖRNEĞİ ÜZERİNDEN BİR LİTERATÜR TARAMASI VE DEĞERLENDİRME


Abstract

Positivist academic studies rest on the testing of theories or hypotheses, and in this testing process the presence and composition of participants is critically important. In this regard, while students, recruited through convenience sampling, were the most widely used participant group in past years, in recent years the Internet has come into widespread use as a new means of recruiting participants and collecting data. In this context, the concept of crowdsourcing (CS), introduced by Jeff Howe in 2006, has grown increasingly popular, but it has also brought many problems onto the agenda. In simple terms, crowdsourcing can be defined as obtaining the data or information sources required in a research context from a large crowd of people. In the social sciences, it can also be understood as collecting the needed data over the Internet through data collection platforms with large participant pools, such as Amazon Mechanical Turk, Survey Monkey, TryMyUI, Facebook, Qualtrics, etc. However, data obtained in the social sciences through convenience sampling on these platforms raise numerous issues, including sampling, reliability, validity, representativeness, statistical analysis, and the generalizability of findings. The aim of this study is to present, through a critical literature review, the pros and cons of CS applications, which have become increasingly widespread in recent years, and the dilemmas and problems posed by data collected in this way in the context of scientific research. Within the scope of the study, discussions and assessments are conducted on the basis of the Amazon Mechanical Turk platform, which is prominent in the literature.


Details

Primary Language Turkish
Subjects Business Administration
Journal Section Research Article
Authors

Remzi Altunışık

Publication Date December 1, 2017
Submission Date October 17, 2017
Published in Issue Year 2017 ICMEB17 Özel Sayısı

Cite

APA Altunışık, R. (2017). BİR VERİ TOPLAMA PLATFORMU OLARAK CROWDSOURCING CS : MECHANICAL TURK ÖRNEĞİ ÜZERİNDEN BİR LİTERATÜR TARAMASI VE DEĞERLENDİRME. Uluslararası Yönetim İktisat Ve İşletme Dergisi, 13(13), 975-982.