• Media type: E-Book
  • Title: Apparent Algorithmic Discrimination and Real-Time Algorithmic Learning
  • Contributor: Lambrecht, Anja [Author]; Tucker, Catherine E. [Other]
  • Published: [S.l.]: SSRN, [2020]
  • Extent: 1 online resource (28 p.)
  • Language: English
  • DOI: 10.2139/ssrn.3570076
  • Footnote: According to information from SSRN, the original version of the document was created on November 4, 2020
  • Description: An important concern is that algorithms can inadvertently discriminate against minority groups and reinforce existing inequality. Typically, the worry is that when classification algorithms are trained on a dataset that itself reflects bias, that bias may be reinforced. However, in the world of digital content, many algorithms make judgements in real time to determine which content will be engaging. We revisit the context of a classic study documenting that searches on Google for black names were more likely to return ads highlighting the need for a criminal background check than searches for white names. We document that one explanation for this finding is that if an algorithm receives less data in real time about one group, it will learn at different speeds. Since black names are less common, the algorithm learns about the quality of the underlying ad more slowly, and as a result an ad, including an undesirable ad, is more likely to persist for searches next to black names even if the algorithm judges the ad to be of low quality. We extend this result by presenting evidence that ads targeted towards searches for religious groups persist longer for religious groups that are searched for less often. This suggests that the process of real-time algorithmic learning can lead to differential outcomes between those whose characteristics are more common and those who are rarer in society (see the illustrative sketch following this record)
  • Access State: Open Access
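
The following is a minimal illustrative sketch, not the paper's model or data: it assumes a platform estimates an ad's click-through rate (CTR) from live feedback and removes the ad once enough impressions have accumulated to judge it below a quality threshold. Because a less-searched-for group supplies fewer impressions per day, the estimate converges more slowly and a low-quality ad persists longer for that group, mirroring the mechanism described in the abstract. All parameter values and the function name are hypothetical.

import random


def days_until_ad_dropped(impressions_per_day: int,
                          true_ctr: float = 0.01,
                          quality_threshold: float = 0.02,
                          min_impressions: int = 200,
                          max_days: int = 365,
                          seed: int = 0) -> int:
    """Return the number of days before the ad is removed.

    The ad is removed once enough impressions have accumulated for the
    platform to trust its CTR estimate and that estimate falls below the
    quality threshold (hypothetical decision rule for illustration).
    """
    rng = random.Random(seed)
    impressions = 0
    clicks = 0
    for day in range(1, max_days + 1):
        # Simulate one day of impressions; each impression is clicked
        # with probability true_ctr.
        for _ in range(impressions_per_day):
            impressions += 1
            if rng.random() < true_ctr:
                clicks += 1
        # Only act once there is enough data to trust the estimate.
        if impressions >= min_impressions:
            estimated_ctr = clicks / impressions
            if estimated_ctr < quality_threshold:
                return day
    return max_days


if __name__ == "__main__":
    # Hypothetical search volumes: a commonly searched group vs. a rarer group.
    for label, volume in [("common group", 500), ("rarer group", 20)]:
        days = days_until_ad_dropped(impressions_per_day=volume)
        print(f"{label}: low-quality ad persisted for about {days} day(s)")

Under these assumed parameters the commonly searched group accumulates the minimum number of impressions within a day and the low-quality ad is dropped almost immediately, while the rarer group needs many days just to reach that minimum, so the same ad persists far longer for its searches.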