• Media type: Text; Electronic Conference Proceeding
  • Title: Crowdsourcing Quality of Experience Experiments
  • Contributor: Egger-Lampl, Sebastian [Author]; Redi, Judith [Author]; Hoßfeld, Tobias [Author]; Hirth, Matthias [Author]; Möller, Sebastian [Author]; Naderi, Babak [Author]; Keimel, Christian [Author]; Saupe, Dietmar [Author]
  • Published: KOPS - The Institutional Repository of the University of Konstanz, 2017-09-28
  • Language: English
  • DOI: https://doi.org/10.1007/978-3-319-66435-4_7
  • Footnote: This data source also contains holdings records that do not lead to a full text.
  • Description: Crowdsourcing enables new possibilities for QoE evaluation by moving the evaluation task from the traditional laboratory environment into the Internet, allowing researchers to easily access a global pool of workers for the evaluation task. This not only makes it possible to include a more diverse population and real-life environments in the evaluation, but also reduces the turn-around time and significantly increases the number of subjects participating in an evaluation campaign, thereby circumventing bottlenecks in traditional laboratory setups. In order to utilise these advantages, the differences between laboratory-based and crowd-based QoE evaluation are discussed in this chapter.
  • Access State: Open Access
  • Rights information: In Copyright