JOURNAL ARTICLE

Gaze estimation based on head movements in virtual reality applications using deep learning

Abstract

Gaze detection in virtual reality systems is mostly performed using eye-tracking devices. The coordinates of the gaze, as well as other data regarding the eyes, serve as input values for the applications. While this trend is becoming increasingly popular in the interaction design of immersive systems, most headsets do not come with an embedded eye-tracker, especially low-cost ones and those based on mobile phones. We suggest implementing an innovative gaze estimation system in virtual environments as a source of information about users' intentions. We propose a solution based on a combination of image features and head movement, used as input to a deep convolutional neural network capable of inferring the 2D gaze coordinates in the imaging plane.
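The idea described above — fusing image features with head-movement data and regressing 2D gaze coordinates — can be sketched as follows. This is a minimal illustration, not the authors' architecture: the layer sizes, the use of global average pooling as a stand-in for the convolutional backbone, and the choice of yaw/pitch/roll deltas as head-movement features are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv_backbone(image):
    """Stand-in for a CNN feature extractor (assumption: in the real system
    this would be a trained deep convolutional network). Here we simply
    global-average-pool the image over its spatial dimensions."""
    # image: (H, W, C) -> feature vector of length C
    return image.mean(axis=(0, 1))

def gaze_head(img_feat, head_feat, W1, b1, W2, b2):
    """Small regression head: concatenate image and head-movement features,
    apply one ReLU hidden layer, and output (x, y) gaze coordinates."""
    x = np.concatenate([img_feat, head_feat])
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    return h @ W2 + b2                # 2D gaze point in the imaging plane

# Toy inputs: one 64x64 RGB frame and three head-rotation deltas
# (yaw, pitch, roll) -- the specific head features are an assumption.
image = rng.random((64, 64, 3))
head = rng.random(3)

img_feat = conv_backbone(image)                    # length 3
W1 = rng.standard_normal((6, 16)); b1 = np.zeros(16)
W2 = rng.standard_normal((16, 2)); b2 = np.zeros(2)

gaze_xy = gaze_head(img_feat, head, W1, b1, W2, b2)
print(gaze_xy.shape)  # one (x, y) prediction
```

In a trained system the weights would of course be learned by minimizing the error between predicted and ground-truth gaze points; the sketch only shows how the two input modalities are combined in a single forward pass.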

Keywords:
Gaze, Computer science, Virtual reality, Artificial intelligence, Computer vision, Estimation, Optical head-mounted display, Augmented reality, Human–computer interaction, Computer graphics (images), Engineering

Metrics

Cited By: 17
FWCI (Field-Weighted Citation Impact): 1.58
Refs: 4
Citation Normalized Percentile: 0.82


Topics

Gaze Tracking and Assistive Technology
Physical Sciences →  Computer Science →  Human-Computer Interaction
Hand Gesture Recognition Systems
Physical Sciences →  Computer Science →  Human-Computer Interaction
Visual Attention and Saliency Detection
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition