Description:
<p>In an age when people spend most of their time indoors and smartphones have become a necessity, there is a growing demand for determining a user's absolute position in indoor environments. While the global positioning system (GPS) performs well outdoors, its inaccuracy cannot be tolerated in places where the GPS signal is weak or barely detectable. This has led to a number of solutions that use the smartphone's inertial measurement unit (IMU) to track the user's location. Most IMU-based methods track a person's trajectory through stride-length and heading estimation, so the accuracy of stride-length estimation plays a crucial role in these methods. Inspired by recent successes in computer vision and machine learning, we propose an image-based stride-length estimation method that employs the Gramian angular field (GAF) to convert accelerometer data into images, which are then fed into a convolutional neural network (CNN) to predict the stride length. We evaluate the performance of the proposed method on the public dataset provided by Qu Wang in his GitHub repository (available at https://github.com/Archeries/StrideLengthEstimation). The results show that, using only accelerometer data, the proposed method achieves higher accuracy than other approaches both over a single stride and over long walking distances.</p>
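<p>To make the GAF step concrete, the following Python snippet is a minimal sketch of one common way to compute a Gramian angular summation field (GASF) from a single accelerometer segment. It reflects our own assumptions (the segment length, the summation variant of GAF, and illustrative names such as <code>stride_segment</code>), not the exact preprocessing pipeline of the method described above.</p>
<pre><code>
# Minimal sketch of the Gramian angular summation field (GASF) transform,
# assuming one accelerometer-magnitude segment per stride.
import numpy as np

def gramian_angular_field(x: np.ndarray) -> np.ndarray:
    """Convert a 1-D signal into a 2-D GASF image."""
    # Rescale the signal into [-1, 1] so that arccos is defined.
    x_min, x_max = x.min(), x.max()
    x_scaled = 2.0 * (x - x_min) / (x_max - x_min) - 1.0
    # Polar encoding: each sample becomes an angle.
    phi = np.arccos(np.clip(x_scaled, -1.0, 1.0))
    # GASF entry (i, j) is cos(phi_i + phi_j).
    return np.cos(phi[:, None] + phi[None, :])

# Example: a synthetic one-stride accelerometer segment of 64 samples (illustrative only)
stride_segment = np.sin(np.linspace(0, 2 * np.pi, 64))
gaf_image = gramian_angular_field(stride_segment)  # shape (64, 64), suitable as CNN input
</code></pre>
<p>The resulting square image encodes pairwise temporal correlations of the segment, which is what allows a standard image CNN to regress the stride length from it.</p>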