

Poster

SpiLiFormer: Enhancing Spiking Transformers with Lateral Inhibition

Zeqi Zheng · Yanchen Huang · Yingchao Yu · Zizheng Zhu · Junfeng Tang · Zhaofei Yu · Yaochu Jin


Abstract:

Spiking Neural Networks (SNNs) based on Transformers have garnered significant attention due to their superior performance and high energy efficiency. However, the spiking attention modules of most existing Transformer-based SNNs are adapted from those of analog Transformers, failing to fully address the issue of over-allocating attention to irrelevant contexts. To fix this fundamental yet overlooked issue, we propose a Lateral Inhibition-inspired Spiking Transformer (SpiLiFormer). It emulates the brain's lateral inhibition mechanism, guiding the model to enhance attention to relevant tokens while suppressing attention to irrelevant ones. Our model achieves state-of-the-art (SOTA) performance across multiple datasets, including CIFAR-10 (+0.45%), CIFAR-100 (+0.48%), CIFAR10-DVS (+2.70%), N-Caltech101 (+1.94%), and ImageNet-1K (+1.6%). Notably, on the ImageNet-1K dataset, SpiLiFormer (69.9M parameters, 4 time steps, 384 resolution) outperforms E-SpikeFormer (173.0M parameters, 8 time steps, 384 resolution), a SOTA spiking Transformer, by 0.46% using only 39% of the parameters and half the time steps. Our code and training checkpoints will be released upon acceptance.
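As a rough illustration of the idea only (the abstract does not spell out the paper's actual formulation), lateral inhibition can be grafted onto attention by further suppressing scores that fall below the per-query average, so weakly matching tokens receive even less attention. The sketch below is a minimal, non-spiking PyTorch example under that assumption; the function name and the `inhibition_strength` parameter are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F

def lateral_inhibition_attention(q, k, v, inhibition_strength=0.5):
    """Toy scaled-dot-product attention with a lateral-inhibition-style
    correction: scores below the per-query mean are pushed further down,
    sharpening attention on the most relevant tokens.

    q, k, v: (batch, heads, tokens, dim) tensors.
    NOTE: illustrative assumption, not SpiLiFormer's actual spiking mechanism.
    """
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5   # raw attention scores

    # Inhibition step (hypothetical): tokens whose score is below the
    # per-query mean are suppressed in proportion to how far below they are.
    mean_score = scores.mean(dim=-1, keepdim=True)
    deficit = (mean_score - scores).clamp(min=0)
    inhibited = scores - inhibition_strength * deficit

    attn = F.softmax(inhibited, dim=-1)
    return attn @ v

# Usage: random tensors standing in for token embeddings.
q = k = v = torch.randn(2, 4, 16, 32)
out = lateral_inhibition_attention(q, k, v)
print(out.shape)  # torch.Size([2, 4, 16, 32])
```

In a spiking Transformer the analogous suppression would act on binary spike-based attention maps rather than real-valued softmax scores; the sketch only conveys the qualitative effect of inhibiting attention to irrelevant tokens.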
