• Media type: E-book
  • Titel: The Digital Experiment Reporting Protocol (DERP)
  • Contributors: Weinmann, Markus [author]; Schneider, Christoph [author]; Valacich, Joseph [author]
  • Published: [S.l.]: SSRN, [2022]
  • Extent: 1 online resource (81 p.)
  • Language: English
  • DOI: 10.2139/ssrn.3985991
  • Keywords: Methods; digital experiments; reporting; reviewing
  • Notes: According to SSRN, the original version of the document was created on December 15, 2021.
  • Description: Digital experiments, i.e., experiments that examine an IT artifact and/or largely use information technology (IT) for stimulus presentation and/or response collection, have become a widely used research method for establishing cause-and-effect relationships. Many experiments, however, have been criticized for lacking reproducibility, which can have serious consequences. For example, nonreplicable experiments may lead practitioners to base their decisions on misleading evidence, thereby harming the reputation of science. To improve the replicability of experiments, authors need to report methods and results in a clear, transparent, and comprehensive manner. Hence, to improve the reporting of digital experiments, we have drawn upon the prior literature to develop a reporting checklist and associated guidelines. Specifically, we have drawn on the Consolidated Standards of Reporting Trials (CONSORT), which has been endorsed by more than 600 academic journals. However, because CONSORT was developed to improve the reporting of clinical trials that evaluate the effectiveness and safety of medications or medical devices, it lacks the items and guidelines necessary to comprehensively report digital experiments. We therefore propose the Digital Experiments Reporting Protocol (DERP), which adapts CONSORT into a suitable guide for reporting digital experiments. Applying our checklist to top-tier information systems articles (from Information Systems Research and MIS Quarterly), we found that, on average, only 39% report core items in sufficient detail, highlighting the need for a standardized checklist. As such, our checklist provides a valuable tool with which authors and reviewers can improve both study reporting and replicability.
  • Access status: Open access