
LIONESS Lab: a free web-based platform for conducting interactive experiments online

Author

Listed:
  • Marcus Giamattei

    (University of Passau
    University of Nottingham
    Bard College Berlin)

  • Kyanoush Seyed Yahosseini

    (Max Planck Institute for Human Development)

  • Simon Gächter

    (University of Nottingham
    Institute of Labour Economics
    Center for Economic Studies)

  • Lucas Molleman

    (University of Nottingham
    Max Planck Institute for Human Development
    University of Amsterdam)

Abstract

LIONESS Lab is a free web-based platform for interactive online experiments. An intuitive, user-friendly graphical interface enables researchers to develop, test, and share experiments online, with minimal need for programming experience. LIONESS Lab provides solutions for the methodological challenges of interactive online experimentation, including ways to reduce waiting time, form groups on-the-fly, and deal with participant dropout. We highlight key features of the software, and show how it meets the challenges of conducting interactive experiments online.
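The abstract's mention of forming groups on-the-fly and handling participant dropout can be illustrated with a minimal sketch. This is a toy matching lobby, not LIONESS Lab's actual implementation; the `Lobby` class and all names and logic here are illustrative assumptions.

```python
# Hypothetical sketch of on-the-fly group formation with dropout handling.
# Not LIONESS Lab's implementation; names and logic are assumptions.
from collections import deque


class Lobby:
    def __init__(self, group_size):
        self.group_size = group_size
        self.waiting = deque()   # participants waiting to be matched
        self.groups = []         # completed groups

    def arrive(self, participant_id):
        """Add a participant; form a group as soon as enough are waiting."""
        self.waiting.append(participant_id)
        if len(self.waiting) >= self.group_size:
            group = [self.waiting.popleft() for _ in range(self.group_size)]
            self.groups.append(group)
            return group
        return None

    def drop_out(self, participant_id):
        """Remove a participant who left before being matched."""
        try:
            self.waiting.remove(participant_id)
            return True
        except ValueError:
            return False


lobby = Lobby(group_size=3)
assert lobby.arrive("p1") is None
assert lobby.arrive("p2") is None
lobby.drop_out("p2")              # p2 leaves before being matched
assert lobby.arrive("p3") is None
first_group = lobby.arrive("p4")
print(first_group)                # ['p1', 'p3', 'p4']
```

Matching participants as they arrive, rather than waiting for a fixed session to fill, is one way to reduce waiting times in interactive online experiments; removing dropouts from the queue keeps stale participants from blocking group formation.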

Suggested Citation

  • Marcus Giamattei & Kyanoush Seyed Yahosseini & Simon Gächter & Lucas Molleman, 2020. "LIONESS Lab: a free web-based platform for conducting interactive experiments online," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 6(1), pages 95-111, June.
  • Handle: RePEc:spr:jesaex:v:6:y:2020:i:1:d:10.1007_s40881-020-00087-0
    DOI: 10.1007/s40881-020-00087-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s40881-020-00087-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s40881-020-00087-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version you can access through your library subscription.

    As the access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Jérôme Hergueux & Nicolas Jacquemet, 2015. "Social preferences in the online laboratory: a randomized experiment," Experimental Economics, Springer;Economic Science Association, vol. 18(2), pages 251-283, June.
    2. repec:hal:pseose:halshs-00984211 is not listed on IDEAS
    3. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    4. John Horton & David Rand & Richard Zeckhauser, 2011. "The online laboratory: conducting experiments in a real labor market," Experimental Economics, Springer;Economic Science Association, vol. 14(3), pages 399-425, September.
    5. Jonathan Quidt & Francesco Fallucchi & Felix Kölle & Daniele Nosenzo & Simone Quercia, 2017. "Bonus versus penalty: How robust are the effects of contract framing?," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 3(2), pages 174-182, December.
    6. Katrin Schmelz & Anthony Ziegelmeyer, 2015. "Social Distance and Control Aversion: Evidence from the Internet and the Laboratory," TWI Research Paper Series 100, Thurgauer Wirtschaftsinstitut, Universität Konstanz.
    7. James Pettit & Daniel Friedman & Curtis Kephart & Ryan Oprea, 2014. "Software for continuous game experiments," Experimental Economics, Springer;Economic Science Association, vol. 17(4), pages 631-648, December.
    8. Chan, Shu Wing & Schilizzi, Steven & Iftekhar, Md Sayed & Da Silva Rosa, Raymond, 2019. "Web-based experimental economics software: How do they compare to desirable features?," Journal of Behavioral and Experimental Finance, Elsevier, vol. 23(C), pages 138-160.
    9. Simon Gächter & Lingbo Huang & Martin Sefton, 2016. "Combining “real effort” with induced effort costs: the ball-catching task," Experimental Economics, Springer;Economic Science Association, vol. 19(4), pages 687-712, December.
    10. Molnar, Andras, 2019. "SMARTRIQS: A Simple Method Allowing Real-Time Respondent Interaction in Qualtrics Surveys," Journal of Behavioral and Experimental Finance, Elsevier, vol. 22(C), pages 161-169.
    11. Giamattei, Marcus & Lambsdorff, Johann Graf, 2019. "classEx — an online tool for lab-in-the-field experiments with smartphones," Journal of Behavioral and Experimental Finance, Elsevier, vol. 22(C), pages 223-231.
    12. repec:cup:judgdm:v:5:y:2010:i:5:p:411-419 is not listed on IDEAS
    13. Chen, Daniel L. & Schonger, Martin & Wickens, Chris, 2016. "oTree—An open-source platform for laboratory, online, and field experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 9(C), pages 88-97.
    14. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    15. Urs Fischbacher, 2007. "z-Tree: Zurich toolbox for ready-made economic experiments," Experimental Economics, Springer;Economic Science Association, vol. 10(2), pages 171-178, June.
    16. Lucas Molleman & Felix Kölle & Chris Starmer & Simon Gächter, 2019. "People prefer coordinated punishment in cooperative interactions," Nature Human Behaviour, Nature, vol. 3(11), pages 1145-1153, November.
    17. Erik Snowberg & Leeat Yariv, 2018. "Testing the Waters: Behavior across Participant Pools," NBER Working Papers 24781, National Bureau of Economic Research, Inc.
    18. Marcus R. Munafò & Brian A. Nosek & Dorothy V. M. Bishop & Katherine S. Button & Christopher D. Chambers & Nathalie Percie du Sert & Uri Simonsohn & Eric-Jan Wagenmakers & Jennifer J. Ware & John P. A. Ioannidis, 2017. "A manifesto for reproducible science," Nature Human Behaviour, Nature, vol. 1(1), pages 1-9, January.
    19. Yariv, Leeat & Snowberg, Erik, 2018. "Testing the Waters: Behavior across Participant Pools," CEPR Discussion Papers 13015, C.E.P.R. Discussion Papers.
    20. Erik Snowberg & Leeat Yariv, 2018. "Testing the Waters: Behavior across Participant Pools," CESifo Working Paper Series 7136, CESifo.
    21. Neil Stewart & Jesse Chandler & Gabriele Paolacci, "undated". "Crowdsourcing Samples in Cognitive Science," Mathematica Policy Research Reports c63e922cf7554604a919e1f18, Mathematica Policy Research.
    22. Berinsky, Adam J. & Huber, Gregory A. & Lenz, Gabriel S., 2012. "Evaluating Online Labor Markets for Experimental Research: Amazon.com's Mechanical Turk," Political Analysis, Cambridge University Press, vol. 20(3), pages 351-368, July.
    23. Siddharth Suri & Duncan J Watts, 2011. "Cooperation and Contagion in Web-Based, Networked Public Goods Experiments," PLOS ONE, Public Library of Science, vol. 6(3), pages 1-18, March.
    24. Akihiro Nishi & Hirokazu Shirado & David G. Rand & Nicholas A. Christakis, 2015. "Inequality and visibility of wealth in experimental social networks," Nature, Nature, vol. 526(7573), pages 426-429, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Antonio A. Arechar & Simon Gächter & Lucas Molleman, 2018. "Conducting interactive experiments online," Experimental Economics, Springer;Economic Science Association, vol. 21(1), pages 99-131, March.
    2. Gary Bolton & Eugen Dimant & Ulrich Schmidt, 2018. "When a Nudge Backfires. Using Observation with Social and Economic Incentives to Promote Pro-Social Behavior," PPE Working Papers 0017, Philosophy, Politics and Economics, University of Pennsylvania.
    3. Gagnon, Nickolas & Bosmans, Kristof & Riedl, Arno, 2020. "The Effect of Unfair Chances and Gender Discrimination on Labor Supply," Research Memorandum 005, Maastricht University, Graduate School of Business and Economics (GSBE).
    4. Stefano DellaVigna & Devin Pope, 2022. "Stability of Experimental Results: Forecasts and Evidence," American Economic Journal: Microeconomics, American Economic Association, vol. 14(3), pages 889-925, August.
    5. Gary E. Bolton & Eugen Dimant & Ulrich Schmidt, 2020. "When a Nudge Backfires: Combining (Im)Plausible Deniability with Social and Economic Incentives to Promote Behavioral Change," CESifo Working Paper Series 8070, CESifo.
    6. Hyndman, Kyle & Walker, Matthew J., 2022. "Fairness and risk in ultimatum bargaining," Games and Economic Behavior, Elsevier, vol. 132(C), pages 90-105.
    7. Holger Herz & Deborah Kistler & Christian Zehnder & Christian Zihlmann, 2022. "Hindsight Bias and Trust in Government," CESifo Working Paper Series 9767, CESifo.
    8. Chan, Shu Wing & Schilizzi, Steven & Iftekhar, Md Sayed & Da Silva Rosa, Raymond, 2019. "Web-based experimental economics software: How do they compare to desirable features?," Journal of Behavioral and Experimental Finance, Elsevier, vol. 23(C), pages 138-160.
    9. Herz, Holger & Kistler, Deborah & Zehnder, Christian & Zihlmann, Christian, 2022. "Hindsight Bias and Trust in Government: Evidence from the United States," FSES Working Papers 526, Faculty of Economics and Social Sciences, University of Freiburg/Fribourg Switzerland.
    10. Bicchieri, Cristina & Dimant, Eugen & Xiao, Erte, 2021. "Deviant or wrong? The effects of norm information on the efficacy of punishment," Journal of Economic Behavior & Organization, Elsevier, vol. 188(C), pages 209-235.
    11. Dato, Simon & Feess, Eberhard & Nieken, Petra, 2019. "Lying and reciprocity," Games and Economic Behavior, Elsevier, vol. 118(C), pages 193-218.
    12. Wladislaw Mill & Cornelius Schneider, 2023. "The Bright Side of Tax Evasion," CESifo Working Paper Series 10615, CESifo.
    13. Prissé, Benjamin & Jorrat, Diego, 2022. "Lab vs online experiments: No differences," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 100(C).
    14. Dimant, Eugen & van Kleef, Gerben A. & Shalvi, Shaul, 2020. "Requiem for a Nudge: Framing effects in nudging honesty," Journal of Economic Behavior & Organization, Elsevier, vol. 172(C), pages 247-266.
    15. Engelmann, Dirk & Janeba, Eckhard & Mechtenberg, Lydia & Wehrhöfer, Nils, 2023. "Preferences over taxation of high-income individuals: Evidence from a survey experiment," European Economic Review, Elsevier, vol. 157(C).
    16. Nicolas Jacquemet & Alexander G James & Stéphane Luchini & James J Murphy & Jason F Shogren, 2021. "Do truth-telling oaths improve honesty in crowd-working?," PLOS ONE, Public Library of Science, vol. 16(1), pages 1-18, January.
    17. Hans-Theo Normann & Till Requate & Israel Waichman, 2014. "Do short-term laboratory experiments provide valid descriptions of long-term economic interactions? A study of Cournot markets," Experimental Economics, Springer;Economic Science Association, vol. 17(3), pages 371-390, September.
    18. Sönke Ehret & Sara M. Constantino & Elke U. Weber & Charles Efferson & Sonja Vogt, 2022. "Group Identities Make Fragile Tipping Points," CESifo Working Paper Series 9737, CESifo.
    19. Salvatore Nunnari & Giovanni Montari, 2019. "Audi Alteram Partem: An Experiment on Selective Exposure to Information," Working Papers 650, IGIER (Innocenzo Gasparini Institute for Economic Research), Bocconi University.
    20. Andreas C. Drichoutis & Veronika Grimm & Alexandros Karakostas, 2020. "Bribing to Queue-Jump: An experiment on cultural differences in bribing attitudes among Greeks and Germans," Working Papers 2020-2, Agricultural University of Athens, Department Of Agricultural Economics.

    More about this item

    Keywords

    Experimental software; Interactive online experiments; Experimental standards;

    JEL classification:

    • C90 - Mathematical and Quantitative Methods - - Design of Experiments - - - General


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jesaex:v:6:y:2020:i:1:d:10.1007_s40881-020-00087-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.