• Media type: Electronic Conference Proceeding
  • Title: One shot crowdtesting: approaching the extremes of crowdsourced subjective quality testing
  • Contributor: Seufert, Michael [Author]; Hoßfeld, Tobias [Author]
  • Imprint: Augsburg University Publication Server (OPUS), 2023-09-06
  • Language: English
  • DOI: https://doi.org/10.21437/PQS.2016-26
  • Footnote: This data source also contains holdings records that do not lead to a full text.
  • Description: Crowdsourcing studies for subjective quality testing have become a particularly useful tool for Quality of Experience researchers. Typically, crowdsourcing studies are conducted by many unsupervised workers, who rate the perceived quality of several test conditions during one session (mixed within-subject test design). However, such studies often prove to be very sensitive, for example, to test instructions, test design, and the filtering of unreliable participants. Moreover, exposing single workers to several test conditions potentially leads to an implicit training and anchoring of their ratings. Therefore, this work investigates the extreme case of presenting only a single test condition to each worker (completely between-subjects test design). The results are compared to a typical crowdsourcing study design with multiple test conditions in order to discuss training effects in crowdsourcing studies. Thus, this work investigates whether a simple 'one shot' design, with only one rating from each of a large number of workers, can replace sophisticated (mixed or within-subject) test designs in crowdsourcing.
  • Access State: Open Access