Gaze estimation has attracted increasing attention due to its widespread applications. In unconstrained real-world environments, however, performance remains unstable because of large variations in head pose and environmental conditions such as illumination changes. This paper proposes a novel appearance-based gaze estimation method that extracts multi-scale features to mitigate the effects of head pose changes and lighting variations. We demonstrate the effectiveness of the proposed method through experiments on three popular gaze estimation datasets. Experimental results show that our method achieves prediction errors of 3.47°, 10.57°, and 6.95° on the MPIIFaceGaze, Gaze360, and RT-GENE datasets, respectively.
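The multi-scale feature idea named in the abstract can be illustrated with a minimal, hypothetical sketch (this is not the paper's actual network): pool an input image at several spatial scales and concatenate the per-scale descriptors, so that both fine local detail and coarse global context survive in one feature vector. The function name and the choice of average pooling are assumptions for illustration only.

```python
import numpy as np

def multiscale_features(image, scales=(1, 2, 4)):
    """Hypothetical sketch of multi-scale feature extraction:
    average-pool the image with block sizes given in `scales`
    and concatenate the flattened results into one vector."""
    feats = []
    h, w = image.shape
    for s in scales:
        # Crop so the image divides evenly, then average over s x s blocks.
        pooled = (image[:h - h % s, :w - w % s]
                  .reshape(h // s, s, w // s, s)
                  .mean(axis=(1, 3)))
        feats.append(pooled.ravel())
    # Fine-scale entries keep local detail; coarse-scale entries
    # summarize global appearance (e.g., overall illumination level).
    return np.concatenate(feats)

img = np.arange(16, dtype=float).reshape(4, 4)
vec = multiscale_features(img, scales=(1, 2, 4))
print(vec.shape)  # → (21,): 16 + 4 + 1 entries across the three scales
```

In a real appearance-based model the pooling would be replaced by learned convolutional branches, but the concatenation of descriptors computed at several resolutions is the core of the multi-scale approach.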
Rawdha Karmi, Ines Rahmany, Nawrès Khlifa