• Media type: E-Article
  • Title: Toward Software-Equivalent Accuracy on Transformer-Based Deep Neural Networks With Analog Memory Devices
  • Contributor: Spoon, Katie; Tsai, Hsinyu; Chen, An; Rasch, Malte J.; Ambrogio, Stefano; Mackin, Charles; Fasoli, Andrea; Friz, Alexander M.; Narayanan, Pritish; Stanisavljevic, Milos; Burr, Geoffrey W.
  • Imprint: Frontiers Media SA, 2021
  • Published in: Frontiers in Computational Neuroscience
  • Language: English
  • DOI: 10.3389/fncom.2021.675741
  • ISSN: 1662-5188
  • Keywords: Cellular and Molecular Neuroscience; Neuroscience (miscellaneous)
  • Description: Recent advances in deep learning have been driven by ever-increasing model sizes, with networks growing to millions or even billions of parameters. Such enormous models call for fast and energy-efficient hardware accelerators. We study the potential of Analog AI accelerators based on Non-Volatile Memory, in particular Phase Change Memory (PCM), for software-equivalent accurate inference of natural language processing applications. We demonstrate a path to software-equivalent accuracy for the GLUE benchmark on BERT (Bidirectional Encoder Representations from Transformers), by combining noise-aware training to combat inherent PCM drift and noise sources, together with reduced-precision digital attention-block computation down to INT6. (A minimal sketch of these two techniques follows this record.)
  • Access State: Open Access
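
The abstract names two techniques: noise-aware training against PCM drift and noise, and reduced-precision (INT6) digital attention arithmetic. The sketch below illustrates both in plain PyTorch as a minimal illustration under stated assumptions, not the paper's implementation: `NoisyLinear`, `noise_scale`, and `fake_quantize_int6` are hypothetical names, and multiplicative Gaussian weight noise scaled to the largest weight is one common way PCM weight noise is emulated during training.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NoisyLinear(nn.Linear):
    """Linear layer that injects Gaussian weight noise during training,
    a common way to emulate PCM programming/read noise. The class name
    and `noise_scale` are illustrative, not taken from the paper."""

    def __init__(self, in_features, out_features, noise_scale=0.05):
        super().__init__(in_features, out_features)
        self.noise_scale = noise_scale

    def forward(self, x):
        w = self.weight
        if self.training:
            # Fresh noise on every forward pass, scaled to the largest
            # weight magnitude; .detach() keeps the noise amplitude out
            # of the gradient, so gradients update the clean weights.
            sigma = self.noise_scale * w.abs().max().detach()
            w = w + torch.randn_like(w) * sigma
        return F.linear(x, w, self.bias)


def fake_quantize_int6(t: torch.Tensor) -> torch.Tensor:
    """Symmetric fake-quantization to 6-bit integer levels in [-31, 31],
    a stand-in for the reduced-precision digital attention arithmetic."""
    scale = t.abs().max().clamp(min=1e-8) / 31.0
    return torch.round(t / scale).clamp(-31, 31) * scale


# Usage sketch: a BERT-base-sized projection (hidden size 768) trained
# with weight noise, with attention scores fake-quantized to INT6.
layer = NoisyLinear(768, 768)
layer.train()
x = torch.randn(4, 128, 768)                 # (batch, sequence, hidden)
y = layer(x)                                  # noisy forward pass
scores = fake_quantize_int6(y @ y.transpose(-1, -2))
```

Training against injected noise in this fashion makes the learned weights tolerant of the perturbations analog PCM devices introduce at inference time, which is the core idea behind the noise-aware training the abstract describes.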