Abstract

Multiple target tracking is a well-known computer vision problem due to its wide range of applications in industrial and biomedical domains. A significant amount of research has been done to tackle the problem, but it is still challenging to design a robust tracking system that can provide solutions for many real-world tracking applications. In this paper, we present a two-stage approach for tracking multiple target trajectories in video sequences. In the first stage, machine learning is used to predict the type of trajectory along which a target is travelling. We selected a Bidirectional Long Short-Term Memory (BiLSTM) Recurrent Neural Network to classify the trajectories generated by the targets; this network is well suited to learning bidirectional long-term dependencies between time steps of sequential data. In the second stage, a polynomial curve (arc) of 2nd degree is fit to the location data of the previous six video frames, and a polynomial curve (arc) of 3rd degree is fit to the velocity curve computed from the same six frames. This information is used to extrapolate the target's future velocity and future location. We demonstrate the tracking abilities of our new approach and estimate the prediction error on simulated trajectory data and on real-world videos.
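The second stage described above can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes one frame per time step, uses speed (velocity magnitude) between consecutive frames as the "velocity curve", and the function name and return format are invented for the example.

```python
import numpy as np

def predict_next_location(xs, ys, dt=1.0):
    """Sketch of the paper's second stage: fit a 2nd-degree polynomial
    to the last six (x, y) locations and a 3rd-degree polynomial to the
    speeds derived from them, then extrapolate one frame ahead.
    Assumptions: uniform frame spacing dt; speed stands in for velocity.
    """
    t = np.arange(6, dtype=float) * dt      # times of the six frames
    # 2nd-degree fit to each coordinate of the location data
    cx = np.polyfit(t, xs, 2)
    cy = np.polyfit(t, ys, 2)
    # speeds between consecutive frames (five samples), 3rd-degree fit
    speeds = np.hypot(np.diff(xs), np.diff(ys)) / dt
    tv = t[:-1] + dt / 2                    # midpoints of the speed samples
    cv = np.polyfit(tv, speeds, 3)
    t_next = t[-1] + dt
    v_next = np.polyval(cv, t_next)         # extrapolated future speed
    x_next = np.polyval(cx, t_next)         # extrapolated future location
    y_next = np.polyval(cy, t_next)
    return (x_next, y_next), v_next

# Example: a target on a parabolic path, observed for six frames
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
ys = xs ** 2
loc, v = predict_next_location(xs, ys)
```

For a path that is exactly quadratic in time, as here, the 2nd-degree location fit extrapolates the next position exactly (x = 6, y = 36); on noisy real tracks the fit smooths the observations instead.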

Keywords:
Tracking, Trajectory, Computer science, Artificial intelligence, Artificial neural network, Polynomial, Computer vision, Mathematics

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 0.21
References: 21
Citation Normalized Percentile: 0.57


Topics

Video Surveillance and Tracking Methods (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Human Pose and Action Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Anomaly Detection Techniques and Applications (Physical Sciences → Computer Science → Artificial Intelligence)