JOURNAL ARTICLE

Assessment of Smartphone GNSS Measurements in Tightly Coupled Visual Inertial Navigation

Mehmet Fikret Ocal, Murat Durmaz, Engin Tunalı, Hasan Yıldız

Journal: Applied Sciences  Year: 2025  Vol: 15 (23)  Article No.: 12796  Publisher: Multidisciplinary Digital Publishing Institute

Abstract

Precise, seamless, and high-rate navigation remains a major challenge, particularly when relying on low-cost sensors. With the decreasing cost of cameras, Inertial Measurement Units (IMUs), and Global Navigation Satellite System (GNSS) receivers, tightly coupled fusion frameworks, such as GVINS, have gained considerable attention. GVINS is an optimization-based factor-graph framework that integrates visual and inertial measurements with single-frequency GNSS code-pseudorange observations to provide robust and drift-free navigation. This study evaluated the potential of applying GVINS to low-cost, low-power, single-frequency GNSS receivers, particularly those embedded in smartphones, by integrating 1 Hz GNSS measurements collected in three challenging urban scenarios into the GVINS framework to produce seamless 10 Hz positioning estimates. The experiments were conducted using an Xsens MTi-1 IMU and global-shutter (GS) cameras, as well as a Samsung A51 smartphone and a u-blox ZED-F9P GNSS receiver. GVINS was modified to process 1 Hz GNSS measurements. Differential corrections from a nearby GNSS reference station were also incorporated to assess their impact on optimization-based estimators such as GVINS. The performance of the GVINS and Differential GVINS (D-GVINS) solutions using smartphone measurements was compared against standard point positioning (SPP) and differential GPS (DGPS) results obtained from the same smartphone GNSS receiver, as well as the GVINS solution derived from u-blox ZED-F9P measurements sampled at 1 Hz. Experimental results show that GVINS operates effectively with smartphone GNSS measurements, reducing 3D RMS errors by 80.4%, 64.9%, and 83.8% for the sports-field, campus-walking, and campus-driving datasets, respectively, when differential corrections are applied, relative to the SPP solution.
These results highlight the potential of smartphone GNSS receivers within the GVINS framework: even though they track fewer constellations and fewer satellites, with lower signal quality, they can still achieve performance comparable to that of a relatively higher-end dual-frequency GNSS receiver, the u-blox ZED-F9P. Further studies will focus on adapting the GVINS algorithm to run directly on smartphones in order to utilize all the available measurements, including the camera, IMU, barometer, magnetometer, and additional ranging sensors.
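The GNSS observable fused in GVINS is the single-frequency code pseudorange, and D-GVINS additionally applies corrections derived at a nearby reference station. A minimal sketch of the standard pseudorange model and the differential-correction idea follows; all function and variable names are illustrative, not from the paper, and real processing involves further terms (Earth rotation, multipath, noise) omitted here:

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)

def pseudorange(sat_pos, rx_pos, rx_clk_bias_s, sat_clk_bias_s,
                iono_m=0.0, tropo_m=0.0):
    """Single-frequency code pseudorange model:
    rho = |r_sat - r_rx| + c*(dt_rx - dt_sat) + I + T."""
    geom = np.linalg.norm(np.asarray(sat_pos, float) - np.asarray(rx_pos, float))
    return geom + C * (rx_clk_bias_s - sat_clk_bias_s) + iono_m + tropo_m

def dgps_correction(base_measured_rho, base_pos, sat_pos, sat_clk_bias_s):
    """Observed-minus-computed pseudorange at a reference station with a
    precisely known position. It captures errors (residual satellite clock,
    ionosphere, troposphere) common to a nearby rover, which subtracts the
    correction from its own measurement before fusion."""
    geom = np.linalg.norm(np.asarray(sat_pos, float) - np.asarray(base_pos, float))
    return base_measured_rho - (geom - C * sat_clk_bias_s)
```

In a factor-graph formulation such as GVINS, the residual (measured minus modeled pseudorange) for each satellite becomes one GNSS factor constraining the receiver position and clock-bias states alongside the visual and inertial factors.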
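The reported improvements are percentage reductions in 3D RMS position error relative to the SPP baseline. As a quick illustration of how such a figure is computed (the numbers below are made up for the example, not the paper's data):

```python
import numpy as np

def rms_3d(err_xyz):
    """3D RMS over per-epoch position-error vectors (shape N x 3):
    sqrt(mean of squared 3D error magnitudes)."""
    e = np.asarray(err_xyz, dtype=float)
    return float(np.sqrt(np.mean(np.sum(e**2, axis=1))))

def pct_reduction(baseline_rms, improved_rms):
    """Percentage reduction of RMS error relative to a baseline solution."""
    return 100.0 * (baseline_rms - improved_rms) / baseline_rms
```

For instance, a fused solution with 1.96 m 3D RMS against a 10 m SPP baseline would be reported as an 80.4% reduction.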

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 35

Related Documents

CONFERENCE PAPER

Tightly-coupled GNSS-aided Visual-Inertial Localization

Woosik Lee, Patrick Geneva, Yulin Yang, Guoquan Huang

Conference: 2022 International Conference on Robotics and Automation (ICRA)  Year: 2022  Pages: 9484-9491

JOURNAL ARTICLE

GRVINS: Tightly Coupled GNSS-Range-Visual-Inertial System

Bingxian Lu, Yu-Chung Tsai, Kuo-Shih Tseng

Journal: Journal of Intelligent & Robotic Systems  Year: 2024  Vol: 110 (1)

JOURNAL ARTICLE

Nonlinear Observer for Tightly Coupled Integrated Inertial Navigation Aided by RTK-GNSS Measurements

Jakob M. Hansen, Tor Arne Johansen, Nadezda Sokolova, Thor I. Fossen

Journal: IEEE Transactions on Control Systems Technology  Year: 2018  Vol: 27 (3)  Pages: 1084-1099

JOURNAL ARTICLE

Accurate and Capable GNSS-Inertial-Visual Vehicle Navigation via Tightly Coupled Multiple Homogeneous Sensors

Zhiheng Shen, Xingxing Li, Yuxuan Zhou, Shengyu Li, Zongzhou Wu, Xuanbin Wang

Journal: IEEE Transactions on Automation Science and Engineering  Year: 2024  Vol: 22  Pages: 5464-5478

JOURNAL ARTICLE

A Real-Time, Robust Visual-Inertial Navigation System Tightly Coupled With GNSS and Barometer

Yifan Che, Jiuxiang Dong

Journal: IEEE Sensors Letters  Year: 2024  Vol: 8 (6)  Pages: 1-4