• Media type: E-article
  • Titel: Robust Inertial Motion Tracking through Deep Sensor Fusion across Smart Earbuds and Smartphone
  • Contributors: Gong, Jian; Zhang, Xinyu; Huang, Yuanjun; Ren, Ju; Zhang, Yaoxue
  • Published: Association for Computing Machinery (ACM), 2021
  • Published in: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 5 (2021) 2, pages 1-26
  • Language: English
  • DOI: 10.1145/3463517
  • ISSN: 2474-9567
  • Description: IMU-based inertial tracking plays an indispensable role in many mobility-centric tasks, such as robotic control, indoor navigation, and virtual reality gaming. Despite its mature application to rigid machine mobility (e.g., robots and aircraft), tracking human users via mobile devices remains a fundamental challenge due to intractable gait/posture patterns. Recent data-driven models have tackled sensor drift, one key issue that plagues inertial tracking. However, these systems still assume the device is held or attached to the user's body in a relatively fixed posture. In practice, natural body activities may rotate/translate the device, and this can be mistaken for whole-body movement. Such motion artifacts remain the dominant factor that causes existing inertial tracking systems to fail in practical, uncontrolled settings. Inspired by the observation that the human head, compared to other body parts, exhibits far less intense movement relative to the body during walking, we propose a novel multi-stage sensor fusion pipeline called DeepIT, which realizes inertial tracking by synthesizing the IMU measurements from a smartphone and an associated earbud. DeepIT introduces a data-driven, reliability-aware attention model, which assesses the reliability of each IMU and opportunistically synthesizes their data to mitigate the impact of motion noise. Furthermore, DeepIT uses a reliability-aware magnetometer compensation scheme to combat the angular drift caused by unrestricted motion artifacts. We validate DeepIT on the first large-scale inertial navigation dataset involving both smartphone and earbud IMUs. The evaluation results show that DeepIT achieves multi-fold accuracy improvements in challenging uncontrolled natural walking scenarios, compared with state-of-the-art closed-form and data-driven models.
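
A minimal sketch (PyTorch) of the reliability-aware attention fusion idea described in the abstract: each device's IMU stream is assumed to be encoded into a fixed-size feature vector per time window, a small scoring network assigns each stream a scalar reliability score, and softmax-normalized scores weight the blend of smartphone and earbud features. The class name ReliabilityAttentionFusion, the constant FEAT_DIM, and the scorer layout are illustrative assumptions, not the authors' DeepIT architecture.

    # Hypothetical sketch of reliability-aware attention fusion over two IMU
    # streams; not the paper's implementation.
    import torch
    import torch.nn as nn

    FEAT_DIM = 64  # assumed feature size produced by each per-IMU encoder

    class ReliabilityAttentionFusion(nn.Module):
        """Scores each IMU's features and blends them by softmax weights."""

        def __init__(self, feat_dim: int = FEAT_DIM):
            super().__init__()
            # Maps a feature vector to one scalar reliability score.
            self.scorer = nn.Sequential(
                nn.Linear(feat_dim, feat_dim // 2),
                nn.ReLU(),
                nn.Linear(feat_dim // 2, 1),
            )

        def forward(self, phone_feat: torch.Tensor,
                    earbud_feat: torch.Tensor) -> torch.Tensor:
            # phone_feat, earbud_feat: (batch, feat_dim)
            scores = torch.cat(
                [self.scorer(phone_feat), self.scorer(earbud_feat)], dim=1
            )                                        # (batch, 2)
            weights = torch.softmax(scores, dim=1)   # reliability weights, sum to 1
            # Weighted blend of the two feature streams.
            fused = weights[:, :1] * phone_feat + weights[:, 1:] * earbud_feat
            return fused                             # (batch, feat_dim)

    # Usage: fuse one batch of window-level features from both devices.
    fusion = ReliabilityAttentionFusion()
    phone = torch.randn(8, FEAT_DIM)
    earbud = torch.randn(8, FEAT_DIM)
    out = fusion(phone, earbud)  # shape (8, 64)

In such a scheme, a window in which the smartphone undergoes strong hand motion would ideally receive a lower weight, so the fused features lean on the comparatively steadier earbud IMU; the paper's actual model is learned end to end and is more elaborate than this two-layer scorer.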