JOURNAL ARTICLE

Early Stage Flame Segmentation with Deep Learning and Intel's OpenVINO Toolkit

Abstract

Advances in Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL) now make it possible to predict certain anomalies quickly enough to help prevent unforeseen situations and disasters. One such application is early-stage flame segmentation, a key step in preventing environmental disasters. It is important not only to build a model capable of highly accurate pattern recognition but also to optimize it for real-time execution. In this paper, we demonstrate the capabilities of DeepLabv3+ for early-stage flame segmentation on a custom-made dataset with challenging conditions, and near-real-time execution through the adoption of Intel's OpenVINO toolkit. Inference is accelerated by 70.46% to 93.46%, and on the GPU, FP16 precision nearly doubles inference speed compared to FP32. These findings are significant because early-stage flame segmentation is a critical component of disaster prevention in environmental settings, and our results demonstrate the potential of the OpenVINO toolkit for accelerating the inference process.
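If the reported acceleration figures (70.46% to 93.46%) are read as reductions in inference time, they imply the following throughput speedup factors. This is a minimal arithmetic sketch under that assumption; the abstract does not state the exact metric definition, and the function name is ours:

```python
def latency_reduction_to_speedup(reduction_pct: float) -> float:
    """Convert a percentage reduction in inference time into a
    throughput speedup factor: a 50% time reduction means 2x speedup."""
    return 1.0 / (1.0 - reduction_pct / 100.0)

# Reported range of inference-time reduction from the paper
low = latency_reduction_to_speedup(70.46)   # ~3.39x throughput
high = latency_reduction_to_speedup(93.46)  # ~15.29x throughput

# FP16 vs FP32 on GPU: "almost 2 times" faster corresponds to
# roughly a 50% reduction in per-frame inference time
fp16 = latency_reduction_to_speedup(50.0)   # 2.0x
```

This illustrates why even a modest-sounding percentage reduction can translate into a large throughput gain at the top of the reported range.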

Keywords:
Flame segmentation, Deep learning, Machine learning, Artificial intelligence, Inference, Acceleration, Execution time, Computer engineering

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 0.55
References: 13
Citation Normalized Percentile: 0.59

Topics

Fire Detection and Safety Systems (Physical Sciences → Engineering → Safety, Risk, Reliability and Quality)
Fire dynamics and safety research (Physical Sciences → Engineering → Safety, Risk, Reliability and Quality)
Fire effects on ecosystems (Physical Sciences → Environmental Science → Global and Planetary Change)