JOURNAL ARTICLE

Semantic Understanding of Foggy Scenes with Purely Synthetic Data

Abstract

This work addresses the problem of semantic scene understanding under foggy road conditions. Although marked progress has been made in semantic scene understanding in recent years, it has mainly concentrated on clear-weather outdoor scenes. Extending semantic segmentation methods to adverse weather conditions like fog is crucially important for outdoor applications such as self-driving cars. In this paper, we propose a novel method that uses purely synthetic data to improve performance on unseen real-world foggy scenes captured in the streets of Zurich and its surroundings. Our results highlight the potential and power of photo-realistic synthetic images for training and especially fine-tuning deep neural networks. Our contributions are threefold: 1) we created a purely synthetic, high-quality foggy dataset of 25,000 unique outdoor scenes, which we call Foggy Synscapes and plan to release publicly; 2) we show that with this data we outperform previous approaches on real-world foggy test data; 3) we show that a combination of our data and previously used data can further improve performance on real-world foggy data.
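The abstract does not spell out how fog is rendered onto the synthetic clear-weather images. Work in this line commonly uses the standard optical (atmospheric scattering) model, in which a clear image R is attenuated by a depth-dependent transmittance t(x) = exp(-beta * d(x)) and blended with a global atmospheric light L. Below is a minimal sketch under that assumption; the function name `synthesize_fog` and the parameters `beta` and `airlight` are illustrative choices, not taken from the paper.

```python
import numpy as np

def synthesize_fog(clear_rgb: np.ndarray, depth_m: np.ndarray,
                   beta: float = 0.01, airlight: float = 0.8) -> np.ndarray:
    """Apply the standard atmospheric scattering model to a clear image.

    clear_rgb : H x W x 3 float array in [0, 1], the clear-weather image R
    depth_m   : H x W per-pixel scene depth in meters
    beta      : attenuation coefficient (larger beta = denser fog)
    airlight  : global atmospheric light L (assumed achromatic here)
    """
    # Transmittance t(x) = exp(-beta * d(x)) decays with distance,
    # so far-away pixels are dominated by the airlight term.
    t = np.exp(-beta * depth_m)[..., np.newaxis]
    # Hazy image formation: I(x) = R(x) * t(x) + L * (1 - t(x))
    return clear_rgb * t + airlight * (1.0 - t)
```

With a per-pixel depth map, as available for synthetic scenes, nearby structure stays crisp while distant pixels fade toward the airlight, which is the visual signature of homogeneous fog.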

Metrics

Cited By: 67
FWCI (Field Weighted Citation Impact): 3.31
Refs: 40
Citation Normalized Percentile: 0.94 (in top 1%; in top 10%)


Topics

Video Surveillance and Tracking Methods (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Advanced Neural Network Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Generative Adversarial Networks and Image Synthesis (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)