• Media type: E-article
  • Title: Approaching the Real-World : Supporting Activity Recognition Training with Virtual IMU Data
  • Contributors: Kwon, Hyeokhyen; Wang, Bingyao; Abowd, Gregory D.; Plötz, Thomas
  • Published: Association for Computing Machinery (ACM), 2021
  • Published in: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies
  • Language: English
  • DOI: 10.1145/3478096
  • ISSN: 2474-9567
  • Keywords: Computer Networks and Communications; Hardware and Architecture; Human-Computer Interaction
  • Description: Recently, IMUTube introduced a paradigm change for bootstrapping human activity recognition (HAR) systems for wearables. The key idea is to utilize videos of activities to support the training of activity recognizers based on inertial measurement units (IMUs). The system retrieves videos from public repositories and subsequently generates virtual IMU data from them. The ultimate vision for such a system is to make large amounts of weakly labeled videos accessible for model training in HAR and, as such, to overcome one of the most pressing issues in the field: the lack of significant amounts of labeled sample data. In this paper we present the first in-depth exploration of IMUTube in a realistic assessment scenario: the analysis of free-weight gym exercises. We make significant progress towards a flexible, fully functional IMUTube system by extending it to handle a range of artifacts that are common in unrestricted online videos, including various forms of video noise, non-human poses, body part occlusions, and extreme camera and human motion. By overcoming these real-world challenges, we are able to generate high-quality virtual IMU data, which allows us to employ IMUTube for practical analysis tasks. We show that HAR systems trained by incorporating virtual sensor data generated by IMUTube significantly outperform baseline models trained only with real IMU data. In doing so, we demonstrate the practical utility of IMUTube and the progress made towards the final vision of the new bootstrapping paradigm.