• Media type: E-Article
  • Title: Augmented reality-assisted gesture-based teleoperated system for robot motion planning
  • Contributor: Salman, Ahmed Eslam; Roman, Magdy Raouf
  • Imprint: Emerald, 2023
  • Published in: Industrial Robot: the international journal of robotics research and application
  • Language: English
  • DOI: 10.1108/ir-11-2022-0289
  • ISSN: 0143-991X
  • Keywords: Industrial and Manufacturing Engineering ; Computer Science Applications ; Control and Systems Engineering
  • Description: <jats:sec> <jats:title content-type="abstract-subheading">Purpose</jats:title> <jats:p>The study proposed a human–robot interaction (HRI) framework that enables operators to communicate remotely with robots in a simple and intuitive way. The study focused on situations in which operators with no programming skills must accomplish teleoperated tasks involving randomly localized, different-sized objects in an unstructured environment. The purpose of this study is to reduce stress on operators, increase accuracy and reduce task-completion time. A particular application of the proposed system is radioactive isotope production factories. The approach combines the reactivity of the operator’s direct control with the powerful tools of vision-based object classification and localization.</jats:p> </jats:sec> <jats:sec> <jats:title content-type="abstract-subheading">Design/methodology/approach</jats:title> <jats:p>Perceptive real-time gesture control based on a Kinect sensor is formulated by information fusion between human intuitiveness and an augmented reality-based vision algorithm. Objects are localized using a developed feature-based vision algorithm, in which the homography is estimated and the Perspective-n-Point (PnP) problem is solved. The 3D object position and orientation are stored in the robot end-effector memory for final mission adjustment, awaiting a gesture control signal to autonomously pick/place an object. Object classification is performed using a one-shot Siamese neural network (NN) to train a proposed deep NN; other well-known models are also used for comparison.</jats:p> </jats:sec> <jats:sec> <jats:title content-type="abstract-subheading">Findings</jats:title> <jats:p>The system was contextualized in one of the nuclear industry applications, radioactive isotope production, and its validation was performed through a user study in which 10 participants of different backgrounds were involved. The results revealed the effectiveness of the proposed teleoperation system and demonstrated its potential for use by users with no robotics experience to effectively accomplish remote robot tasks.</jats:p> </jats:sec> <jats:sec> <jats:title content-type="abstract-subheading">Social implications</jats:title> <jats:p>The proposed system reduces risk and increases the level of safety when applied in hazardous environments such as nuclear facilities.</jats:p> </jats:sec> <jats:sec> <jats:title content-type="abstract-subheading">Originality/value</jats:title> <jats:p>The contribution and uniqueness of the presented study lie in the development of a well-integrated HRI system that can tackle the four aforementioned circumstances in an effective and user-friendly way. High operator–robot reactivity is maintained by using the direct control method, while considerable cognitive stress is removed by using an elective autonomous mode to manipulate randomly localized objects of different configurations. This necessitates building an effective deep learning algorithm (compared with well-known methods) to recognize objects under different conditions: illumination levels, shadows and different postures.</jats:p> </jats:sec>
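The abstract's localization pipeline estimates a homography from matched image features before solving the Perspective-n-Point problem. A minimal sketch of the homography-estimation step via the standard direct linear transform (DLT) is shown below; the paper's actual feature detector, correspondence matcher, and PnP solver are not specified here, so this is an illustrative assumption rather than the authors' implementation.

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate a 3x3 homography H mapping src -> dst via DLT.

    src, dst: (N, 2) arrays of matched 2D points, N >= 4.
    Returns H normalized so that H[2, 2] == 1.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear constraints on h.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.array(rows)
    # The solution is the right singular vector of A with smallest
    # singular value (null-space direction of A).
    _, _, vt = np.linalg.svd(A)
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

if __name__ == "__main__":
    # Synthetic check: project points through a known homography,
    # then recover it from the correspondences.
    H_true = np.array([[1.0, 0.2, 3.0],
                       [0.1, 1.1, -2.0],
                       [0.001, 0.002, 1.0]])
    src = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 3]], dtype=float)
    proj = np.column_stack([src, np.ones(len(src))]) @ H_true.T
    dst = proj[:, :2] / proj[:, 2:3]
    print(np.allclose(estimate_homography(src, dst), H_true, atol=1e-6))
```

With the homography in hand, a PnP solver (given the camera intrinsics and the object's known planar dimensions) recovers the 3D position and orientation that, per the abstract, are stored in the robot end-effector memory until a gesture triggers the pick/place action.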