Title:
Masked Spikformer: Gaussian based and Random Spike Masking for Energy-Efficient Spiking Transformers
Contributors:
Fleury, Anthony
Publisher Information:
2025.
Publication Year:
2025
Document Type:
Conference object
File Description:
application/pdf
Language:
English
Rights:
CC BY-NC
Accession Number:
edsair.od......4254..6312b9f76b3da63b63ebabd5e61daf99
Database:
OpenAIRE

Further Information

Spiking Neural Networks (SNNs) are increasingly explored for their energy efficiency and biological plausibility, offering a compelling alternative to traditional artificial neural networks in neuromorphic applications. However, even Spikformer, a fully spiking adaptation of the Transformer model, can exhibit significant computational redundancy due to excessive spike activity, resulting in non-negligible energy consumption. In this paper, we introduce Masked Spikformer, a unified framework for regulating temporal spike activity in spiking Transformers through three complementary masking strategies: Random Spike Masking (RSM), Gaussian-Based Spike Masking (GSM), and Gaussian-Based Spike Weighting (GSW). These approaches encompass both binary masking and continuous, learnable temporal weighting. Our method is integrated into fully spiking architectures and applied consistently during both training and inference. Experimental results on neuromorphic benchmarks (CIFAR10-DVS and DVS Gesture) show that the proposed masking strategies significantly reduce energy consumption while maintaining high classification performance. Notably, the best accuracy on DVS Gesture is achieved by the RSM variant with 70% masking, while GSW with centered Gaussian weighting attains the highest accuracy on CIFAR10-DVS. The RSM variant also provides the lowest energy consumption across both datasets, highlighting the effectiveness of temporal sparsity for energy-efficient spiking Transformers.
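The abstract names three temporal masking strategies but does not give their formulas, so the following is only a plausible sketch of what such operators could look like, not the paper's actual method. It assumes a spike tensor indexed by time step T in its first dimension; the function names (`random_spike_mask`, `gaussian_time_weights`, `gaussian_spike_mask`), the default `sigma = T / 4`, and the thresholding rule for GSM are all illustrative assumptions.

```python
import numpy as np


def random_spike_mask(spikes, mask_ratio=0.7, rng=None):
    """Sketch of Random Spike Masking (RSM): zero out a random
    subset of time steps. mask_ratio=0.7 mirrors the 70% masking
    mentioned in the abstract; the sampling scheme is an assumption."""
    rng = rng or np.random.default_rng(0)
    T = spikes.shape[0]
    n_keep = max(1, int(round(T * (1.0 - mask_ratio))))
    keep = rng.choice(T, size=n_keep, replace=False)
    mask = np.zeros(T)
    mask[keep] = 1.0
    # Broadcast the per-time-step mask over all remaining dimensions.
    return spikes * mask.reshape((T,) + (1,) * (spikes.ndim - 1)), mask


def gaussian_time_weights(T, mu=None, sigma=None):
    """Sketch of Gaussian-Based Spike Weighting (GSW): continuous
    weights over time steps, centered by default (the 'centered
    Gaussian weighting' variant). sigma = T / 4 is an assumed default;
    in a learnable version mu and sigma would be trained parameters."""
    mu = (T - 1) / 2.0 if mu is None else mu
    sigma = T / 4.0 if sigma is None else sigma
    t = np.arange(T)
    w = np.exp(-0.5 * ((t - mu) / sigma) ** 2)
    return w / w.max()  # normalize so the peak weight is 1


def gaussian_spike_mask(T, threshold=0.5, mu=None, sigma=None):
    """Sketch of Gaussian-Based Spike Masking (GSM): binarize the
    Gaussian weights so only time steps near the center survive.
    The thresholding rule is an assumption."""
    return (gaussian_time_weights(T, mu, sigma) >= threshold).astype(float)


# Usage: mask a toy (T=8, N=4) binary spike train.
rng = np.random.default_rng(1)
spikes = (rng.random((8, 4)) < 0.5).astype(float)
masked, mask = random_spike_mask(spikes, mask_ratio=0.7, rng=rng)
weighted = spikes * gaussian_time_weights(8)[:, None]
```

Under this reading, RSM enforces a fixed temporal sparsity budget regardless of content, while GSW/GSM concentrate spike activity around an (optionally learnable) temporal center; either way, fewer or down-weighted spikes mean fewer accumulate operations and hence lower energy on neuromorphic hardware.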