JOURNAL ARTICLE

Event-based Object Detection with Lightweight Spatial Attention Mechanism

Abstract

Event cameras convey dynamic visual information as streams of asynchronous digital events, which renders detectors developed for RGB images inapplicable. Previous event-based object detection methods mainly rely on simple template matching or on encoded event maps fed to deep networks, which sacrifices the spatial sparsity of events and performs poorly in noisy environments. This paper proposes a lightweight event-based spatial attention mechanism for one-stage detectors that suppresses event noise and enriches the multi-scale feature maps by merging shallow features. To preserve the sparse nature of events as far as possible, the model is also transplanted from a convolutional neural network to a sparse convolution network and trained in two ways (on its own and with knowledge distillation). Results show that the lightweight spatial attention mechanism is compatible with one-stage detectors and that the convolutional neural network outperforms the sparse convolution network in event-based object detection.
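The core idea of gating an event map by local event density, so that isolated noise events are attenuated while dense object regions are kept, can be illustrated with a minimal NumPy sketch. All names and the box-filter/sigmoid gating below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def box_filter(x, k):
    """Mean filter with a k x k window, built from shifted sums
    over an edge-padded copy of the input."""
    pad = k // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += xp[dy:dy + x.shape[0], dx:dx + x.shape[1]]
    return out / (k * k)

def spatial_attention(event_map, k=5):
    """Weight each pixel by a sigmoid gate on its local event density:
    isolated (likely noise) events receive a low weight, while dense
    regions (likely objects) are preserved."""
    density = box_filter(event_map, k)
    gate = 1.0 / (1.0 + np.exp(-(density - density.mean())))  # sigmoid
    return event_map * gate
```

In this sketch the attention weights come from a hand-crafted density statistic; in a learned detector the gate would instead be produced by a small convolutional branch trained end to end with the one-stage detector.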

Keywords:
Event camera; Object detection; Spatial attention; Convolutional neural network; Sparse convolution; Computer vision; Asynchronous communication; Feature extraction; Noise; Pattern recognition; Artificial intelligence; Artificial neural network

Metrics

Cited by: 8
FWCI (Field-Weighted Citation Impact): 0.64
References: 39
Citation Normalized Percentile: 0.70


Topics

Advanced Memory and Neural Computing (Physical Sciences → Engineering → Electrical and Electronic Engineering)
CCD and CMOS Imaging Sensors (Physical Sciences → Engineering → Electrical and Electronic Engineering)
Advanced Neural Network Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)