Energy-Aware Spike Budgeting for Continual Learning in Spiking Neural Networks for Neuromorphic Vision
2026-02-12 • Neural and Evolutionary Computing
Neural and Evolutionary Computing • Artificial Intelligence • Computer Vision and Pattern Recognition
AI summary
The authors address a challenge in neuromorphic vision systems, which use brain-inspired spiking neural networks for low-power image processing but struggle to learn new things without forgetting old ones. They developed a new method that manages how many spikes (signals) the network uses during training to balance accuracy and energy use, adapting to different types of image data. Their system improves performance on both regular and event-based camera datasets while using less power. This work helps make continual learning more practical for smart vision devices that need to efficiently process changing environments.
Spiking Neural Networks • Neuromorphic Vision • Continual Learning • Catastrophic Forgetting • Energy Efficiency • Experience Replay • Leaky Integrate-and-Fire Neuron • Spike Scheduling • Event-based Cameras • Frame-based Cameras
Authors
Anika Tabassum Meem, Muntasir Hossain Nadid, Md Zesun Ahmed Mia
Abstract
Neuromorphic vision systems based on spiking neural networks (SNNs) offer ultra-low-power perception for event-based and frame-based cameras, yet catastrophic forgetting remains a critical barrier to deployment in continually evolving environments. Existing continual learning methods, developed primarily for artificial neural networks, seldom jointly optimize accuracy and energy efficiency, with particularly limited exploration on event-based datasets. We propose an energy-aware spike budgeting framework for continual SNN learning that integrates experience replay, learnable leaky integrate-and-fire neuron parameters, and an adaptive spike scheduler to enforce dataset-specific energy constraints during training. Our approach exhibits modality-dependent behavior: on frame-based datasets (MNIST, CIFAR-10), spike budgeting acts as a sparsity-inducing regularizer, improving accuracy while reducing spike rates by up to 47%; on event-based datasets (DVS-Gesture, N-MNIST, CIFAR-10-DVS), controlled budget relaxation enables accuracy gains up to 17.45 percentage points with minimal computational overhead. Across five benchmarks spanning both modalities, our method demonstrates consistent performance improvements while minimizing dynamic power consumption, advancing the practical viability of continual learning in neuromorphic vision systems.
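The abstract does not give implementation details, but the core idea of pairing a leaky integrate-and-fire (LIF) neuron with a spike-rate budget can be sketched minimally. Below is an illustrative toy version, not the authors' method: `lif_forward` simulates a single LIF neuron with a leak factor `tau` (standing in for the paper's learnable neuron parameters), and `spike_budget_penalty` is a hypothetical hinge-style loss term that charges nothing while the spike rate stays within a dataset-specific budget, mimicking how a budget could act as a sparsity regularizer on frame-based data and be relaxed for event-based data.

```python
import numpy as np

def lif_forward(inputs, tau, v_th=1.0):
    """Simulate one leaky integrate-and-fire neuron over T timesteps.
    tau in (0, 1) is the membrane leak factor (learnable in the paper's
    framework); v_th is the firing threshold. Hard reset on spike."""
    v = 0.0
    spikes = []
    for x in inputs:
        v = tau * v + x                  # leaky integration of input current
        s = 1.0 if v >= v_th else 0.0    # threshold crossing emits a spike
        spikes.append(s)
        v *= (1.0 - s)                   # reset membrane potential if spiked
    return np.array(spikes)

def spike_budget_penalty(spikes, budget, weight=1.0):
    """Hypothetical budget term: penalize only the spike rate in excess of
    the budget, so sparse activity within budget is free."""
    rate = float(spikes.mean())
    return weight * max(0.0, rate - budget)

# Constant drive of 0.6 with tau = 0.5 fires every third step (rate = 1/3).
spikes = lif_forward(np.full(9, 0.6), tau=0.5)
loose = spike_budget_penalty(spikes, budget=0.5)   # within budget -> 0.0
tight = spike_budget_penalty(spikes, budget=0.2)   # over budget -> positive
```

In a training loop this penalty would be added to the task loss, so gradients (via a surrogate for the spike nonlinearity) push the network toward the energy budget; relaxing `budget` per dataset is one plausible reading of the "adaptive spike scheduler" described above.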