• Media type: Conference Proceedings; E-Article
  • Title: Recurrent Spatial Attention for Facial Emotion Recognition
  • Contributor: Forch, Valentin [Author]; Vitay, Julien [Author]; Hamker, Fred H. [Author]
  • Imprint: Chemnitz: Technische Universität Chemnitz, [2020]
  • Published in: Chemnitzer Linux-Tage 2019 - LocalizeIT Workshop
  • Language: English
  • Keywords: emotion recognition; LSTM; Deep Learning; attention
  • Description: Automatic processing of emotion information through deep neural networks (DNNs) can have great benefits (e.g., for human-machine interaction). Vice versa, machine learning can profit from concepts known from human information processing (e.g., visual attention). We employed a recurrent DNN incorporating a spatial attention mechanism for facial emotion recognition (FER) and compared the output of the network with results from human experiments. The attention mechanism enabled the network to select relevant face regions and achieve state-of-the-art performance on a FER database containing images from realistic settings. A visual search strategy showing some similarities with human saccadic behavior emerged when the model's perceptive capabilities were restricted; however, the model then failed to form a useful scene representation. (A minimal architectural sketch follows this record.)
  • Access State: Open Access
  • Rights information: Attribution - Share Alike (CC BY-SA)
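
The description mentions a recurrent DNN with a spatial attention mechanism that selects face regions over several glimpses. The following is a minimal sketch of such an architecture, not the authors' code: it assumes PyTorch, a differentiable grid_sample-based glimpse extraction (rather than whatever glimpse mechanism the paper actually uses), an LSTM core, and arbitrary choices such as a 24x24 glimpse, a 256-unit hidden state, and 7 emotion classes.

# Minimal sketch of a recurrent spatial-attention classifier (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F

class RecurrentSpatialAttention(nn.Module):
    def __init__(self, glimpse_size=24, hidden_size=256, num_classes=7):
        super().__init__()
        self.glimpse_size = glimpse_size
        # Glimpse network: encodes a small crop together with its location.
        self.glimpse_cnn = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
            nn.Flatten(),
        )
        self.glimpse_fc = nn.Linear(32 * 4 * 4 + 2, hidden_size)
        # Recurrent core accumulates information across glimpses.
        self.rnn = nn.LSTMCell(hidden_size, hidden_size)
        # Heads: where to look next, and which emotion class to predict.
        self.loc_head = nn.Linear(hidden_size, 2)
        self.cls_head = nn.Linear(hidden_size, num_classes)

    def extract_glimpse(self, images, loc):
        # Differentiable crop around loc in [-1, 1]^2 via grid_sample.
        B = images.size(0)
        g = self.glimpse_size
        lin = torch.linspace(-1.0, 1.0, g, device=images.device)
        gy, gx = torch.meshgrid(lin, lin, indexing="ij")
        grid = torch.stack([gx, gy], dim=-1).unsqueeze(0).expand(B, g, g, 2)
        # Scale the patch to a fraction of the image and shift it to loc.
        grid = grid * 0.3 + loc.view(B, 1, 1, 2)
        return F.grid_sample(images, grid, align_corners=False)

    def forward(self, images, num_glimpses=6):
        B = images.size(0)
        h = torch.zeros(B, self.rnn.hidden_size, device=images.device)
        c = torch.zeros_like(h)
        loc = torch.zeros(B, 2, device=images.device)  # start at the image centre
        for _ in range(num_glimpses):
            patch = self.extract_glimpse(images, loc)
            feat = self.glimpse_cnn(patch)
            g = torch.relu(self.glimpse_fc(torch.cat([feat, loc], dim=1)))
            h, c = self.rnn(g, (h, c))
            loc = torch.tanh(self.loc_head(h))  # next fixation location
        return self.cls_head(h)  # emotion logits after the final glimpse

# Usage example with random grayscale 96x96 face crops (sizes are assumptions):
model = RecurrentSpatialAttention()
logits = model(torch.randn(8, 1, 96, 96))
print(logits.shape)  # torch.Size([8, 7])

Restricting the glimpse size in such a model forces the recurrent core to integrate information across fixations, which is the regime in which the abstract reports an emerging saccade-like search strategy alongside a degraded scene representation.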