• Media type: E-Article
  • Title: Fuzzy adaptive extended Kalman filter for robot 3D pose estimation
  • Contributor: Deilamsalehy, Hanieh; Havens, Timothy C.
  • Imprint: Emerald, 2018
  • Published in: International Journal of Intelligent Unmanned Systems
  • Language: English
  • DOI: 10.1108/ijius-12-2017-0014
  • ISSN: 2049-6427
  • Keywords: General Engineering
  • Description:
      Purpose – Estimating the pose (position and orientation) of a moving object such as a robot is a necessary task for many applications, e.g. robot navigation control, environment mapping, and medical applications such as robotic surgery. The purpose of this paper is to introduce a novel method to fuse the information from several available sensors in order to improve on the pose estimated from any individual sensor and calculate a more accurate pose for the moving platform.
      Design/methodology/approach – Pose estimation is usually done by collecting the data obtained from several sensors mounted on the object/platform and fusing the acquired information. Assuming the robot moves in a three-dimensional (3D) world, its location is completely defined by six degrees of freedom (6DOF): three angles and three position coordinates. Some 3D sensors, such as IMUs and cameras, have been widely used for 3D localization. Other sensors, like 2D Light Detection And Ranging (LiDAR), can give a very precise estimate within a 2D plane, but they are not normally employed for 3D estimation since a 2D sensor cannot observe the full 6DOF. However, in some applications the robot moves almost on a plane during the interval between two sensor readings, e.g. a ground vehicle on a flat surface or a drone flying at nearly constant altitude to collect visual data. This paper proposes a novel method using a fuzzy inference system that incorporates a 2D LiDAR into a 3D localization algorithm in order to improve pose estimation accuracy.
      Findings – The method determines the trajectory of the robot and the sensor reliability between two readings and, based on this information, defines the weight of the 2D sensor in the final fused pose by adjusting the extended Kalman filter parameters. Simulation and real-world experiments show that the pose estimation error can be significantly decreased using the proposed method.
      Originality/value – To the best of the authors' knowledge, this is the first time a 2D LiDAR has been employed to improve 3D pose estimation in an unknown environment without any prior knowledge. Simulation and real-world experiments show that the pose estimation error can be significantly decreased using the proposed method.
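      The abstract describes weighting the 2D LiDAR in the fused estimate by adjusting extended Kalman filter parameters according to a fuzzy reliability measure. Below is a minimal, hedged sketch of that general idea: a standard EKF measurement update in which the LiDAR measurement-noise covariance is inflated when a fuzzy-style reliability score is low. The function names (fuzzy_reliability, ekf_update), the membership shapes and thresholds, and the state/measurement layout are illustrative assumptions for this sketch, not the paper's actual fuzzy inference system or filter design.

      ```python
      import numpy as np

      def fuzzy_reliability(roll_rate, pitch_rate, dz):
          """Illustrative reliability score in [0, 1] for the 2D LiDAR.

          Heuristic: the closer the platform stays to planar motion between two
          readings (small roll/pitch rates, small altitude change), the more the
          2D sensor is trusted. The paper uses a fuzzy inference system; this
          simple triangular-membership blend is only a stand-in (scales assumed).
          """
          planar = max(0.0, 1.0 - (abs(roll_rate) + abs(pitch_rate)) / 0.2)  # rad/s scale (assumed)
          level = max(0.0, 1.0 - abs(dz) / 0.05)                             # metres (assumed)
          return planar * level

      def ekf_update(x, P, z, h, H, R_base, reliability, eps=1e-3):
          """Standard EKF measurement update; the 2D LiDAR covariance is
          inflated when reliability is low (low weight -> large R)."""
          R = R_base / max(reliability, eps)      # down-weight unreliable 2D readings
          y = z - h(x)                            # innovation
          S = H @ P @ H.T + R                     # innovation covariance
          K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
          x_new = x + K @ y
          P_new = (np.eye(len(x)) - K @ H) @ P
          return x_new, P_new

      if __name__ == "__main__":
          # Toy 6DOF state [x, y, z, roll, pitch, yaw]; the 2D LiDAR observes [x, y, yaw].
          x = np.zeros(6)
          P = np.eye(6) * 0.1
          H = np.zeros((3, 6)); H[0, 0] = H[1, 1] = H[2, 5] = 1.0
          h = lambda s: H @ s
          z = np.array([0.12, -0.05, 0.01])
          w = fuzzy_reliability(roll_rate=0.01, pitch_rate=0.02, dz=0.0)
          x, P = ekf_update(x, P, z, h, H, np.diag([0.02, 0.02, 0.01]), w)
          print(np.round(x, 3))
      ```

      The design choice illustrated here is that the fuzzy output never gates the LiDAR on or off; it continuously scales the measurement covariance, so near-planar motion lets the precise 2D reading dominate while strongly 3D motion effectively ignores it.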