• Media type: E-Article
  • Title: 3D Gaze Estimation Using RGB-IR Cameras
  • Contributor: Mokatren, Moayad; Kuflik, Tsvi; Shimshoni, Ilan
  • Imprint: MDPI AG, 2022
  • Published in: Sensors
  • Language: English
  • DOI: 10.3390/s23010381
  • ISSN: 1424-8220
  • Keywords: Electrical and Electronic Engineering ; Biochemistry ; Instrumentation ; Atomic and Molecular Physics, and Optics ; Analytical Chemistry
  • Description: In this paper, we present a framework for 3D gaze estimation intended to identify the user's focus of attention in a corneal imaging system. The framework uses a headset that consists of three cameras, a scene camera and two eye cameras: an IR camera and an RGB camera. The IR camera is used to continuously and reliably track the pupil, and the RGB camera is used to acquire corneal images of the same eye. Deep learning algorithms are trained to detect the pupil in IR and RGB images and to compute a per-user 3D model of the eye in real time. Once the 3D model is built, the 3D gaze direction is computed as the ray starting from the eyeball center and passing through the pupil center into the outside world. This model can also be used to transform the pupil position detected in the IR image into its corresponding position in the RGB image and to detect the gaze direction in the corneal image. This technique circumvents the problem of pupil detection in RGB images, which is especially difficult and unreliable when the scene is reflected in the corneal images. In our approach, the auto-calibration process is transparent and unobtrusive: users do not have to be instructed to look at specific objects to calibrate the eye tracker; they need only to act and gaze normally. The framework was evaluated in a user study in realistic settings and the results are promising. It achieved a very low 3D gaze error (2.12°) and very high accuracy in acquiring corneal images (intersection over union, IoU = 0.71). The framework may be used in a variety of real-world mobile scenarios (indoors, indoors near windows, and outdoors) with high accuracy.
  • Access State: Open Access
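
The abstract describes the gaze direction as a ray from the eyeball center through the pupil center, and reports accuracy as an angular error (2.12°) and an intersection-over-union score (0.71). The sketch below is an illustrative reconstruction of those three standard computations only, not the authors' implementation; all function names and the axis-aligned-box form of IoU are assumptions.

```python
import math

def gaze_direction(eyeball_center, pupil_center):
    # Unit gaze ray from the 3D eyeball center through the 3D pupil center.
    v = [p - e for p, e in zip(pupil_center, eyeball_center)]
    norm = math.sqrt(sum(c * c for c in v))
    return [c / norm for c in v]

def angular_error_deg(g_est, g_true):
    # Angle in degrees between two unit gaze directions
    # (how a "2.12 degree gaze error" is typically measured).
    dot = sum(a * b for a, b in zip(g_est, g_true))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

def iou(box_a, box_b):
    # Intersection over union of two axis-aligned boxes (x1, y1, x2, y2),
    # the usual overlap metric behind a score such as IoU = 0.71.
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)
```

For example, a gaze estimate that is exactly right yields an angular error of 0°, and two boxes that overlap in one quarter of each give an IoU well below the paper's reported 0.71.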