UAVs require robust visual perception under rapid motion and varying illumination, conditions to which event cameras are well suited. These cameras asynchronously capture brightness changes as ON and OFF events, providing an efficient modality for visual processing. Bio-inspired Spiking Neural Networks (SNNs), built on spike-driven neuron models, are well matched to such data because they natively encode temporal dynamics. However, the inherent imbalance between ON and OFF events limits the performance of event-based algorithms. This paper introduces an adaptive representation learning method that models event streams as temporal point processes and dynamically balances synaptic plasticity between ON and OFF events. The proposed method ensures that learning is driven equally by both event polarities, mitigating the imbalance and improving the robustness of SNNs. Experimental results on benchmark datasets, including N-CARS, N-CALTECH101, DVS-CIFAR10, and CEP-DVS, demonstrate improvements in SNN classification accuracy. These findings underscore the effectiveness of the proposed adaptive bio-inspired approach and open new possibilities for event-based visual analysis and neuromorphic computing.
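As a rough illustration only (the abstract does not specify the paper's algorithm), the sketch below shows one simple way a polarity imbalance in an event stream could be compensated before driving plasticity: inverse-frequency weights are assigned so that ON and OFF events contribute equally in aggregate. The function name `polarity_balance_weights` and the inverse-frequency scheme are assumptions for illustration, not the authors' adaptive temporal-point-process method.

```python
import numpy as np

def polarity_balance_weights(polarities: np.ndarray) -> np.ndarray:
    """Assign each event a weight inversely proportional to the
    frequency of its polarity, so ON (1) and OFF (0) events carry
    equal total influence on learning updates.

    polarities: 1-D array of event polarities in {0, 1}.
    Returns: per-event weights of the same shape.
    """
    n_on = np.count_nonzero(polarities == 1)
    n_off = np.count_nonzero(polarities == 0)
    total = n_on + n_off
    # Guard against streams containing only one polarity.
    w_on = total / (2 * n_on) if n_on else 0.0
    w_off = total / (2 * n_off) if n_off else 0.0
    return np.where(polarities == 1, w_on, w_off)

if __name__ == "__main__":
    # Heavily ON-dominated stream: 5 ON events, 1 OFF event.
    polarities = np.array([1, 1, 1, 0, 1, 1])
    weights = polarity_balance_weights(polarities)
    print(weights)  # ON events get 0.6, the OFF event gets 3.0
```

Under this weighting, the summed weight of each polarity equals half the total event count, so a plasticity rule that scales its updates by these weights would no longer be dominated by the more frequent polarity.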