Review Article

Software Solutions For Web-Based Experiments: A Comprehensive Review For Cognitive and Performance Research

Year 2025, Volume: 5 Issue: 1, 70 - 83

Abstract

The rapid growth of experimental software packages presents both opportunities and challenges for psychology and cognitive science researchers. These tools offer flexible study designs, but the increasing variety complicates software selection. This review provides a practical guide by comparing major experiment-building platforms across key criteria: timing precision, stimulus presentation, usability, and implementation demands. We systematically evaluate both lab-based and web-based software, highlighting their strengths and limitations. Lab-based systems generally deliver superior experimental control, while modern web platforms have significantly improved reliability and flexibility. We recommend Psychtoolbox and PsychoPy for studies requiring maximum timing precision, E-Prime for clinical and applied settings, and Gorilla or jsPsych for online data collection. Our analysis emphasizes aligning software choice with research context, considering technical expertise, participant accessibility, and experiment complexity. Psychtoolbox and PsychoPy are suited for advanced programmers, whereas E-Prime and Gorilla offer accessible solutions for researchers with limited coding skills. For complex behavioral paradigms, PsychoPy and jsPsych provide versatile options. This review functions as both a comparative analysis and practical handbook, enabling researchers to select appropriate software tailored to their experimental needs. By synthesizing performance benchmarks and implementation considerations, we deliver actionable recommendations to optimize study design across diverse research scenarios.
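Timing precision of the kind benchmarked in this review is ultimately bounded by the host machine's clock granularity and scheduler latency, regardless of which experiment platform sits on top. As a rough illustration (not drawn from the reviewed article, and using only the Python standard library), the sketch below estimates two such low-level factors: the granularity of `time.perf_counter` and how far `time.sleep` overshoots a 60 Hz frame-like interval. The helper names `timer_resolution` and `sleep_overshoot` are our own, for illustration only.

```python
import time
import statistics


def timer_resolution(samples: int = 10_000) -> float:
    """Estimate the smallest nonzero difference between consecutive
    time.perf_counter() readings, in seconds."""
    deltas = []
    for _ in range(samples):
        t0 = time.perf_counter()
        t1 = time.perf_counter()
        while t1 == t0:  # spin until the clock visibly advances
            t1 = time.perf_counter()
        deltas.append(t1 - t0)
    return min(deltas)


def sleep_overshoot(target_ms: float = 16.7, trials: int = 50) -> tuple[float, float]:
    """Measure how far time.sleep() overshoots a frame-like target interval.

    Returns (mean overshoot, SD of overshoot) in milliseconds.
    """
    overshoots = []
    for _ in range(trials):
        t0 = time.perf_counter()
        time.sleep(target_ms / 1000.0)
        elapsed_ms = (time.perf_counter() - t0) * 1000.0
        overshoots.append(elapsed_ms - target_ms)
    return statistics.mean(overshoots), statistics.stdev(overshoots)


if __name__ == "__main__":
    res = timer_resolution()
    mean_over, sd_over = sleep_overshoot()
    print(f"timer granularity: {res * 1e9:.0f} ns")
    print(f"sleep overshoot at ~60 Hz interval: "
          f"{mean_over:.2f} ms (SD {sd_over:.2f} ms)")
```

Results vary by operating system and load; this is exactly why dedicated tools such as Psychtoolbox and PsychoPy synchronize stimulus onsets to display refresh rather than relying on sleep-based delays.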

Thanks

Thank you in advance for your consideration and time.

References

  • Anwyl-Irvine, A. L., Dalmaijer, E. S., Hodges, N., & Evershed, J. K. (2020a). Online Timing Accuracy and Precision: A comparison of platforms, browsers, and participant’s devices.
  • Bohannon, J. (2016). Mechanical Turk upends social sciences. Science, 352(6291), 1263–1264. doi: 10.1126/science.352.6291.1263.
  • Kingstone, A., Smilek, D., & Eastwood, J. D. (2008). Cognitive Ethology: A new approach for studying human cognition. British Journal of Psychology, 99(3), 317-443. doi: 10.1348/000712607X251243
  • Willis, G. B. (1997). The use of the psychological laboratory to study sensitive survey topics. In: L. Harrison, & A. Hughes (Eds), The Validity of Self-reported Drug Use: Improving the Accuracy of Survey Estimates (pp. 416–438). Rockville, MD: National Institute on Drug Abuse.
  • Jobe, J. B. (2003). Cognitive psychology and self-reports: Models and methods. Quality of Life Research, 12(3), 219–227. doi: 10.1023/a:1023279029852
  • Krantz, J. H., Ballard, J., & Scher, J. (1997). Comparing the results of laboratory and World-Wide Web samples on the determinants of female attractiveness. Behavior Research Methods, Instruments, & Computers, 29, 264–269. doi: 10.3758/BF03204824
  • Buchanan, T., & Smith, J. L. (1999). Using the Internet for psychological research: Personality testing on the World Wide Web. British Journal of Psychology, 90, 125–144. doi: 10.1348/000712699161189
  • Buchanan, T., Johnson, J. A., & Goldberg, L. (2005). Implementing a Five-Factor personality inventory for use on the Internet. European Journal of Psychological Assessment, 21(2), 115–127. doi: 10.1027/1015-5759.21.2.115
  • Buchanan, T. (2007). Personality testing on the Internet: What we know, and what we do not. In A. N. Joinson, K. Y. McKenna, T. Postmes, & U.-D. Reips (Eds.), The Oxford handbook of Internet psychology.
  • Gosling, S. D., Vazire, S., Srivastava, S., & John, O. P. (2004). Should We Trust Web-Based Studies? A Comparative Analysis of Six Preconceptions About Internet Questionnaires. American Psychologist, 59(2), 93–104. doi: 10.1037/0003-066X.59.2.93
  • Birnbaum, M. H. (Ed.), (2000). Psychological experiments on the Internet. Academic Press.
  • McGraw, K. O., Tew, M. D., & Williams, J. E. (2000). The integrity of Web-delivered experiments: Can you trust the data? Psychological Science, 11(6), 502–506. doi: 10.1111/1467-9280.00296
  • Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y., & Plumb, I. (2001). The “Reading the Mind in the Eyes” Test Revised Version: A Study with Normal Adults, and Adults with Asperger Syndrome or High-functioning Autism. Journal of Child Psychology and Psychiatry, 42(2), 241–251. doi: 10.1017/S0021963001006643
  • Duchaine, B., & Nakayama, K. (2006). The Cambridge Face Memory Test: Results for neurologically intact individuals and an investigation of its validity using inverted face stimuli and prosopagnosic participants. Neuropsychologia, 44(4), 576–585. doi: 10.1016/j.neuropsychologia.2005.07.001
  • Wilmer, J. B., Germine, L., Chabris, C. F., Chatterjee, G., Williams, M., Loken, E., Nakayama, K., Duchaine, B. (2010). Human face recognition ability is specific and highly heritable. Proceedings of the National Academy of Sciences, 107(11), 5238–5241. doi: 10.1073/pnas.0913053107
  • Germine, L., Nakayama, K., Duchaine, B. C., Chabris, C. F., Chatterjee, G., & Wilmer, J. B. (2012). Is the Web as good as the lab? Comparable performance from Web and lab in cognitive/perceptual experiments. Psychonomic Bulletin & Review, 19(5), 847–857. doi: 10.3758/s13423-012-0296-9
  • Kriegeskorte, N., & Douglas, P. K. (2018). Cognitive computational neuroscience. Nature Neuroscience, 21, 1148–1160. doi: 10.1038/s41593-018-0210-5
  • Bantin, T., Stevens, S., Gerlach, A. L., & Hermann, C. (2016). What does the facial dot probe task tell us about attentional processes in social anxiety? A systematic review. Journal of Behavior Therapy and Experimental Psychiatry, 50, 40–51. doi: 10.1016/j.jbtep.2015.04.009
  • Gillan, C. M., Kosinski, M., Whelan, R., Phelps, E. A., & Daw, N. D. (2016a). Characterizing a psychiatric symptom dimension related to deficits in goal-directed control. eLife.
  • Daw, N. D., Gershman, S. J., Seymour, B., Dayan, P., & Dolan, R. J. (2011). Model-Based Influences on Humans' Choices and Striatal Prediction Errors. Neuron, 69(6), 1204–1215. doi: 10.1016/j.neuron.2011.02.027
  • Gillan, C. M., & Daw, N. D. (2016b). Taking Psychiatry Research Online. Neuron, 91(1), 19–23.
  • Gillan, C. M., Otto, A. R., Phelps, E. A., & Daw, N. D. (2015). Model-based learning protects against forming habits. Cognitive, Affective, & Behavioral Neuroscience, 15, 523–536. doi: 10.3758/s13415-015-0347-6
  • Doğan, B. (2009) Multiple-choice reaction and visual perception in female and male elite athletes. Journal of Sports Medicine and Physical Fitness, 49(1), 91-96.
  • Kokubu, M., Ando, S., Kida, N., & Oda, S. (2006). Interference Effects between Saccadic and Key-Press Reaction Times of Volleyball Players and Nonathletes. Perceptual and Motor Skills, 103(3), 709–716. doi: 10.2466/pms.103.3.709-716
  • Aglioti, S. M., Cesari, P., Romani, M., & Urgesi, C. (2008). Action anticipation and motor resonance in elite basketball players. Nature Neuroscience, 11(9), 1109–1116. doi: 10.1038/nn.2182
  • Fissler, P., Küster, O., Schlee, W., & Kolassa, I.-T. (2013). Novelty Interventions to Enhance Broad Cognitive Abilities and Prevent Dementia. Progress in Brain Research, 207, 403–434. doi: 10.1016/B978-0-444-63327-9.00017-5
  • Taatgen, N. A. (2013). The nature and transfer of cognitive skills. Psychological Review, 120(3), 439–471. doi: 10.1037/a0033138
  • Stahl, C. (2006). Software for Generating Psychological Experiments. Experimental Psychology, 53(3), 218–232. doi: 10.1027/1618-3169.53.3.218
  • Anvari, F., Efendić, E., Olsen, J., Arslan, R. C., Elson, M., & Schneider, I. K. (2022). Bias in Self-Reports: An Initial Elevation Phenomenon. Social Psychological and Personality Science, 14(6), 727-737. doi: 10.1177/19485506221129160
  • Sauter, M., Stefani, M., & Mack, W. (2022). Equal Quality for Online and Lab Data: A Direct Comparison from Two Dual-Task Paradigms. Open Psychology, 4(1), 47–59.
  • Gagné, N., & Franzen, L. (2023). How to Run Behavioural Experiments Online: Best Practice Suggestions for Cognitive Psychology and Neuroscience. Swiss Psychology Open, 3(1), 1–21. doi: 10.5334/spo.34
  • Uittenhove, K., Jeanneret, S., & Vergauwe, E. (2023). From Lab-Testing to Web-Testing in Cognitive Research: Who You Test is More Important than how You Test. Journal of Cognition, 6(1): 13, 1–17. doi: 10.5334/joc.259
  • Anwyl-Irvine, A. L., Dalmaijer, E. S., Hodges, N., & Evershed, J. K. (2021). Realistic precision and accuracy of online experiment platforms, web browsers, and devices. Behavior Research Methods, 53, 1407–1425. doi: 10.3758/s13428-020-01501-5
  • Plant, R. R. (2016). A reminder on millisecond timing accuracy and potential replication failure in computer-based psychology experiments: an open letter. Behavior Research Methods, 48(1), 408–411. doi: 10.3758/s13428-015-0577-0
  • Bridges, D., Pitiot, A., MacAskill, M. R. & Peirce, J. W. (2020). The timing mega-study: Comparing a range of experiment generators, both lab-based and online. PeerJ, 8.
  • Dandurand, F., Shultz, T. R., & Onishi, K. H. (2008). Comparing online and lab methods in a problem solving experiment. Behavior Research Methods, 40, 428–434. doi: 10.3758/brm.40.2.428
  • Erdoğdu, M., Artuner, H., Demirbaş, H., Aytaç, G., & Karasırt, D. (2022). Sporcu Sağlığında Güncel Yaklaşım: Bilgisayar Tabanlı Psikolojik Ölçümler [A current approach in athlete health: computer-based psychological measurements]. Türkiye Sağlık Enstitüleri Başkanlığı Dergisi, 5(3), 43–55. doi: 10.54537/tusebdergisi.1173181
  • Erdoğdu, M., Aytaç, G., & Deliceoğlu, G. (2023). Bilgisayar Temelli Uygulamalar ile Sporcularda Dikkat ve Alt Bileşenlerinin Tespit Edilmesi: Bir Laboratuvar Çalışması [Detecting attention and its subcomponents in athletes with computer-based applications: a laboratory study]. Bilgi Teknolojileri ve İletişim Dergisi, 1(1), 117–146.
  • Peirce, J. W. (2009). Generating stimuli for neuroscience using PsychoPy. Frontiers in Neuroinformatics, 2(10), 1-8. doi: 10.3389/neuro.11.010.2008
  • Peirce, J. W. (2007). PsychoPy—Psychophysics software in Python. Journal of Neuroscience Methods, 162, 8–13. doi: 10.1016/j.jneumeth.2006.11.017
  • Garaizar, P. & Vadillo, M. A. (2014). Accuracy and Precision of Visual Stimulus Timing in PsychoPy: No Timing Errors in Standard Usage. PLoS ONE, 9(11), e112033.
  • Lin, Z., Yang, Z., Feng, C., & Zhang, Y. (2022). PsyBuilder: An Open-Source, Cross-Platform Graphical Experiment Builder for Psychtoolbox With Built-In Performance Optimization. Advances in Methods and Practices in Psychological Science, 5(1), 1–20. doi: 10.1177/25152459211070573
  • Lorca, M. H. & Capilla, A. (2018). Psychtoolbox. A Brief Guide To Start Programming Experiments In Psychology. UAM Ediciones.
  • Hairston, W. D. & Maldjian, J. A. (2009). An adaptive staircase procedure for the E-Prime programming environment. Computer Methods and Programs In Biomedicine, 93, 104-108. doi: 10.1016/j.cmpb.2008.08.003
  • De Clercq, A., Crombez, G., Buysse, A., & Roeyers, H. (2003). A simple and sensitive method to measure timing accuracy. Behavior Research Methods, Instruments, & Computers, 35(1), 109–115. doi: 10.3758/BF03195502
  • Ragozzine, F. (2002). SuperLab LT: Evaluation and Uses in Teaching Experimental Psychology. Teaching of Psychology, 29(3), 251-254.
  • Haxby, J. V., Parasuraman, R., Lalonde, F. & Abboud, H. (1993). SuperLab: General-purpose Macintosh software for human experimental psychology and psychological testing. Behavior Research Methods, Instruments, & Computers, 25(3), 400-405. doi: 10.3758/BF03204531
  • Vaz-Rebelo, P., Otero, J., Costa, C., Morgado, J., & Ishiwa, K. (2014). Questioning About Science Texts and Reading Time Through the Software SUPERLAB. Procedia - Social and Behavioral Sciences, 159, 620–624. doi: 10.1016/j.sbspro.2014.12.436
  • Krause, F., & Lindemann, O. (2014). Expyriment: A Python library for cognitive and neuroscientific experiments. Behavior Research Methods, 46(2), 416–428.
  • Mathôt, S., Schreij, D., & Theeuwes, J. (2012). OpenSesame: An open-source, graphical experiment builder for the social sciences. Behavior Research Methods, 44(2), 314-324. doi: 10.3758/s13428-011-0168-7
  • Anwyl-Irvine, A. L., Massonnié, J., Flitton, A., Kirkham, N., & Evershed, J. K. (2020b). Gorilla in our midst: An online behavioral experiment builder. Behavior Research Methods, 52, 388–407. doi: 10.3758/s13428-019-01237-x
  • Eden, E., Navon, R., Steinfeld, I., Lipson, D. & Yakhini, Z. (2009). GOrilla: a tool for discovery and visualization of enriched GO terms in ranked gene lists. BMC Bioinformatics, 10(48), 1-7. doi: 10.1186/1471-2105-10-48
  • de Leeuw, J. R. (2015). jsPsych: A JavaScript library for creating behavioral experiments in a Web browser. Behavior Research Methods, 47, 1–12. doi: 10.3758/s13428-014-0458-y
  • Pinet, S., Zielinski, C., Mathôt, S., Dufau, S., Alario, F.-X., & Longcamp, M. (2017). Measuring sequences of keystrokes with jsPsych: Reliability of response times and interkeystroke intervals. Behavior Research Methods, 49(3), 1163–1176. doi: 10.3758/s13428-016-0776-3
  • Henninger, F., Shevchenko, Y., Mertens, U. K., Kieslich, P. J., & Hilbig, B. E. (2022). lab.js: A free, open, online study builder. Behavior Research Methods, 54, 556–573. doi: 10.3758/s13428-019-01283-5
  • Straw, A. D. (2008). Vision Egg: An open-source library for realtime visual stimulus generation. Frontiers in Neuroinformatics, 2(4), 1–10. doi: 10.3389/neuro.11.004.2008
  • Pandian, V. P. S., & Suleri, S. (2020). BlackBox Toolkit: Intelligent Assistance to UI Design. arXiv preprint, 25–30. doi: 10.48550/arXiv.2004.01949
  • Yoonessi, A. & Yoonessi, A. (2011). A Glance at Psychophysics Software Programs. Basic and Clinical Neuroscience, 2(3), 73-75.
There are 58 citations in total.

Details

Primary Language English
Subjects Information Systems User Experience Design and Development, Performance Evaluation, Programming Languages, Software Engineering (Other)
Journal Section Reviews
Authors

Beyzanur Karagüzel 0009-0003-5690-4818

Merve Erdoğdu 0000-0002-3745-2639

Early Pub Date August 21, 2025
Publication Date November 11, 2025
Submission Date February 23, 2025
Acceptance Date July 20, 2025
Published in Issue Year 2025 Volume: 5 Issue: 1

Cite

APA Karagüzel, B., & Erdoğdu, M. (2025). Software Solutions For Web-Based Experiments: A Comprehensive Review For Cognitive and Performance Research. Journal of Emerging Computer Technologies, 5(1), 70-83.
Journal of Emerging Computer Technologies is indexed and abstracted by Harvard Hollis, Scilit, ROAD, Google Scholar, and OpenAIRE.

Publisher
Izmir Academy Association
