• Media type: E-Article
  • Title: Bias does not equal bias : a socio-technical typology of bias in data-based algorithmic systems
  • Contributor: Lopez, Paola [Author]
  • Imprint: 2021
  • Published in: Internet Policy Review ; 10(2021), 4, pages 1-29
  • Language: English
  • DOI: 10.14763/2021.4.1598
  • ISSN: 2197-6775
  • Keywords: Artificial intelligence ; Machine learning ; Bias ; Journal article
  • Description: This paper introduces a socio-technical typology of bias in data-driven machine learning and artificial intelligence systems. The typology is linked to the conceptualisations of legal anti-discrimination regulations, so that the concept of structural inequality, and therefore of undesirable bias, is defined accordingly. By analysing the controversial Austrian "AMS algorithm" as a case study as well as examples in the contexts of face detection, risk assessment and health care management, this paper defines the following three types of bias: firstly, purely technical bias as a systematic deviation of the datafied version of a phenomenon from reality; secondly, socio-technical bias as a systematic deviation due to structural inequalities, which must be strictly distinguished from, thirdly, societal bias, which correctly depicts the structural inequalities that prevail in society. This paper argues that a clear distinction must be made between different concepts of bias in such systems in order to analytically assess these systems and, subsequently, inform political action.
  • Access State: Open Access
  • Rights information: Attribution (CC BY)