• Media type: E-Article
  • Title: A unified valence scale based on diagnosis of facial expressions
  • Contributor: Schmitz-Hübsch, Alina; Becker, Ron
  • Published: SAGE Publications, 2022
  • Published in: Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 66 (2022) 1, Seite 1056-1059
  • Language: English
  • DOI: 10.1177/1071181322661500
  • ISSN: 2169-5067; 1071-1813
  • Keywords: Development; Geography, Planning and Development
  • Description: Affect-adaptive systems detect the emotional user state, assess it against the current situation, and adjust interaction accordingly. Tools for real-time emotional state detection, like the Emotient FACET engine (Littlewort et al., 2011), are based on the analysis of facial expressions. When developing affect-adaptive systems, output from the diagnostic engine must be mapped onto theoretical models of emotion. The Circumplex Model of Affect (Russell, 1980) describes emotion on two dimensions: valence and arousal. However, FACET offers three classifiers for valence: positive, neutral, and negative valence. The present study aimed at developing an algorithm that converts these into a unified valence scale. We used FACET to analyze valence-labeled images from the AffectNet database. In a multiple regression analysis, FACET classifier values predicted database valence and explained 38% of the variance. By inserting classifier values into the regression equation, a unified valence scale can be calculated that matches dimensional models of emotion. This research forms the groundwork for adapting systems to the emotional user state based on the FACET engine. A future affect-adaptive system can now use the FACET engine to detect the emotional user state on a unified valence dimension, which allows for distinct classification and interpretation of emotions.
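  • Note: The abstract describes mapping FACET's three valence classifier outputs onto a single valence value via multiple linear regression. The sketch below illustrates that general approach under stated assumptions; the input values, column ordering, and any resulting coefficients are placeholders and do not come from the paper, which only reports that the regression explained 38% of the variance in AffectNet labels.

    # Hedged sketch: fitting a multiple regression that maps FACET's three valence
    # classifier values (positive, neutral, negative) onto labeled valence scores,
    # then applying the fitted equation to new classifier output.
    # All data and names below are illustrative, not taken from the study.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Placeholder training data: one row of FACET evidence values per image
    # (columns: positive, neutral, negative) and the corresponding valence
    # labels on a [-1, 1] scale, as used by AffectNet.
    X = np.array([
        [ 2.1,  0.3, -1.8],
        [-1.5,  0.9,  2.2],
        [ 0.2,  1.7, -0.1],
        [ 1.0, -0.4, -0.9],
    ])
    y = np.array([0.7, -0.6, 0.1, 0.4])

    # Fit the regression; in the study the analogous model explained ~38% of
    # the variance (R^2 ≈ .38) on AffectNet images.
    model = LinearRegression().fit(X, y)
    print("R^2 on the placeholder data:", model.score(X, y))

    def unified_valence(pos, neu, neg):
        """Insert new classifier values into the fitted regression equation
        to obtain a single unified valence estimate."""
        return float(model.intercept_ + model.coef_ @ np.array([pos, neu, neg]))

    # Example: convert one frame's classifier output to a unified valence value.
    print(unified_valence(1.2, 0.1, -0.5))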