JOURNAL ARTICLE

Interpretable Goal-Based Model for Vehicle Trajectory Prediction in Interactive Scenarios

Abstract

The ability to understand the social interaction behaviors between a vehicle and its surroundings while predicting its trajectory in an urban environment is critical for road safety in autonomous driving. Social interactions are difficult to explain because of their inherent uncertainty. In recent years, neural network-based methods have been widely used for trajectory prediction and have been shown to outperform hand-crafted methods; however, they lack interpretability. To overcome this limitation, we combine the interpretability of a discrete choice model with the high accuracy of a neural network-based model for the task of vehicle trajectory prediction in an interactive environment. We implement and evaluate our model on the INTERACTION dataset and demonstrate that our proposed architecture can explain its predictions without compromising accuracy.
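The abstract does not detail the architecture, but the discrete-choice component it refers to is conventionally a multinomial logit over candidate goals: each goal gets a utility that is linear in interpretable features, and a softmax turns utilities into goal probabilities whose learned coefficients can be read directly. The sketch below illustrates only that generic idea; the feature names and weight values are hypothetical, not taken from the paper.

```python
import math

def linear_utility(features, weights):
    """Linear-in-parameters utility: the interpretable core of a discrete choice model."""
    return sum(f * w for f, w in zip(features, weights))

def goal_probabilities(utilities):
    """Multinomial-logit (softmax) choice probabilities over candidate goals."""
    m = max(utilities)                               # subtract max for numerical stability
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical goal candidates, each described by two interpretable features
# (e.g. distance to the goal and lateral deviation from the lane centre).
candidates = [
    [1.0, 0.2],
    [0.5, 0.8],
    [0.9, 0.1],
]
weights = [-1.0, -2.0]  # learned coefficients; their signs are directly readable as costs

utilities = [linear_utility(f, weights) for f in candidates]
probs = goal_probabilities(utilities)
```

Because each weight multiplies a named feature, the resulting goal probabilities can be explained in terms of those features, while a downstream neural decoder would generate the full trajectory toward the chosen goal.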

Keywords:
Interpretability; Trajectory; Computer science; Machine learning; Artificial intelligence; Task (project management); Artificial neural network; Engineering

Metrics

Cited By: 6
FWCI (Field Weighted Citation Impact): 0.98
References: 32
Citation Normalized Percentile: 0.69

Topics

Autonomous Vehicle Technology and Safety (Physical Sciences → Engineering → Automotive Engineering)
Traffic and Road Safety (Physical Sciences → Engineering → Safety, Risk, Reliability and Quality)
Traffic control and management (Physical Sciences → Engineering → Control and Systems Engineering)
