• Media type: E-Book
  • Title: Algorithmic Fairness and Service Failures: Why Firms Should Want Algorithmic Accountability
  • Contributor: Ukanwa, Kalinda [Author]; Rand, William [Author]; Zubcsek, Peter Pal [Author]
  • Imprint: [S.l.]: SSRN, 2022
  • Published in: USC Marshall School of Business Research Paper Sponsored by iORB
  • Extent: 1 online resource (25 p.)
  • Language: English
  • DOI: 10.2139/ssrn.4148214
  • Keywords: algorithmic accountability ; algorithmic fairness ; service failure ; word-of-mouth ; social influence ; diffusion
  • Footnote: According to information from SSRN, the original version of this document was created on June 27, 2022
  • Description: Recent years have witnessed growing consumer concern about the fairness implications of the widespread adoption of AI in business contexts. To protect consumers against bias in algorithmic service decisions, regulators have introduced legislation holding firms accountable for the fairness of their algorithmic decisions. However, regulators have yet to invest in the systematic monitoring of algorithmic fairness. Our research reveals the unintended consequences of this policy: we show that the resulting lack of algorithmic accountability inhibits consumers’ ability to assess firms’ degree of compliance with algorithmic fairness criteria, thereby reducing firms’ ability to manage consumer expectations. We posit that consumers gather information about firm actions from their immediate social network. We model how beliefs of bias may propagate within the market even if the firm is in fact using a fair algorithm. We show that, paradoxically, the lack of algorithmic accountability may lead to a divergence between consumer perceptions and the judicial view regarding the fairness of firm actions. Under certain conditions, a firm with a fair algorithm can be perceived by the population as less fair than a firm with an unfair algorithm. We also demonstrate how a watchdog institution may correct these misperceptions of bias. (An illustrative simulation sketch of this belief-diffusion mechanism follows the record below.)
  • Access State: Open Access
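
The abstract describes beliefs about algorithmic bias spreading through consumers' social networks via word of mouth. Below is a minimal illustrative sketch of that qualitative mechanism, not the authors' actual model: the network structure, the asymmetric belief updates, and all parameter names and values (N, K, FAIL_RATE, TRUE_BIAS, WATCHDOG_AT) are assumptions chosen for illustration. It shows how, when consumers cannot verify compliance, a firm with a perfectly fair algorithm (TRUE_BIAS = 0) can nonetheless come to be perceived as biased, and how a watchdog audit can correct the misperception.

# Illustrative sketch only -- NOT the paper's model. Consumers cannot observe
# whether a service failure was caused by algorithmic bias, so fairness
# perceptions form from private experience plus word of mouth.
import random

random.seed(42)

N = 500             # consumers (assumed)
K = 6               # word-of-mouth contacts per consumer (assumed)
FAIL_RATE = 0.15    # bias-independent chance of a service failure per period
TRUE_BIAS = 0.0     # the firm's algorithm is actually fair
ROUNDS = 60
WATCHDOG_AT = None  # set to a round index to simulate a watchdog audit

neighbors = [random.sample(range(N), K) for _ in range(N)]
belief = [0.5] * N  # perceived probability that the firm's algorithm is biased

for t in range(ROUNDS):
    new_belief = belief[:]
    for i in range(N):
        # Unverifiable cause: a failure nudges belief toward "biased".
        # The update is asymmetric (failures weigh more than successes),
        # an assumed stand-in for negativity bias in attributions.
        if random.random() < FAIL_RATE + TRUE_BIAS:
            private = min(1.0, belief[i] + 0.10)
        else:
            private = max(0.0, belief[i] - 0.01)
        # Word of mouth: blend the private signal with contacts' beliefs.
        social = sum(belief[j] for j in neighbors[i]) / K
        new_belief[i] = 0.5 * private + 0.5 * social
    if WATCHDOG_AT is not None and t == WATCHDOG_AT:
        # A watchdog publishes an audit; beliefs partially snap to the truth.
        new_belief = [0.3 * b + 0.7 * TRUE_BIAS for b in new_belief]
    belief = new_belief

print(f"mean perceived bias after {ROUNDS} rounds: {sum(belief)/N:.2f} "
      f"(true bias = {TRUE_BIAS})")

With these assumed parameters the mean perceived bias drifts well above the true value of 0, echoing the abstract's point that a fair firm can be seen as unfair absent accountability; setting WATCHDOG_AT to a round index pulls beliefs back toward the truth, echoing the watchdog result.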