Title:
Raw event-based adversarial attacks for Spiking Neural Networks with configurable latencies.
Authors:
Du X; School of Mathematics, Jilin University, No. 2699 Qianjin Street, Changchun, 130012, Jilin, China. Electronic address: duxiao18@mails.jlu.edu.cn., Shi W; School of Artificial Intelligence, Jilin University, No. 2699 Qianjin Street, Changchun, 130012, Jilin, China. Electronic address: wanlishi2@gmail.com., Zhao X; Mohamed bin Zayed University of Artificial Intelligence, Building 1B, Masdar City, Abu Dhabi, United Arab Emirates. Electronic address: ritianzhaoxx@gmail.com., Cao Y; School of Mathematics, Jilin University, No. 2699 Qianjin Street, Changchun, 130012, Jilin, China. Electronic address: caoyang@jlu.edu.cn., Gu B; School of Artificial Intelligence, Jilin University, No. 2699 Qianjin Street, Changchun, 130012, Jilin, China. Electronic address: BinGu@jlu.edu.cn., Wu T; School of Artificial Intelligence, Jilin University, No. 2699 Qianjin Street, Changchun, 130012, Jilin, China. Electronic address: wutr@jlu.edu.cn.
Source:
Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2026 Jan; Vol. 193, pp. 108026. Date of Electronic Publication: 2025 Aug 23.
Publication Type:
Journal Article
Language:
English
Journal Info:
Publisher: Pergamon Press Country of Publication: United States NLM ID: 8805018 Publication Model: Print-Electronic Cited Medium: Internet ISSN: 1879-2782 (Electronic) Linking ISSN: 08936080 NLM ISO Abbreviation: Neural Netw Subsets: MEDLINE
Imprint Name(s):
Original Publication: New York : Pergamon Press, [c1988-
Contributed Indexing:
Keywords: Deep learning; Event-based tasks; Machine learning; Spiking Neural Network; White-box adversarial attack
Entry Date(s):
Date Created: 20250903 Date Completed: 20251217 Latest Revision: 20251217
Update Code:
20251217
DOI:
10.1016/j.neunet.2025.108026
PMID:
40902271
Database:
MEDLINE

Further Information

Spiking Neural Networks (SNNs) and data from Dynamic Vision Sensors (DVSs) offer energy-efficient solutions for edge devices with limited battery life. The input latency of event data fed to an SNN is critical for energy savings, and reducing it through configurable parameters is essential. However, security concerns, particularly adversarial attacks on SNNs, are increasingly significant. While existing research focuses primarily on attacking event sequences, which may not always be accessible, attacks on raw event streams with configurable latencies remain underexplored due to challenges such as extreme sparsity and the complexity of discrete optimization. This paper proposes a novel adversarial attack method on raw event streams with configurable latencies. To address the sparsity and the discreteness of the optimization, we smooth it by relaxing binary spikes into continuous values. Furthermore, we introduce an adaptively stochastic strategy for sampling attack latencies. We apply regularization terms to maintain sparsity and to ensure that adversarial samples resemble raw event streams while approximating the target label. Extensive experiments on the N-MNIST, CIFAR10-DVS, N-Caltech-101, and Gesture-DVS datasets demonstrate that our method consistently outperforms existing approaches, achieving higher attack success rates (ASR) across various latencies. Ablation studies validate the effectiveness of our contributions and highlight the impact of latency on the generation of adversarial samples.
(Copyright © 2025 Elsevier Ltd. All rights reserved.)
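The abstract describes three ingredients: relaxing binary spikes to continuous values so gradients can flow, stochastically sampling the attack latency with adaptive weights, and regularizing for sparsity so the perturbed stream still resembles a raw event stream. The following minimal Python/PyTorch sketch illustrates that general recipe only; it is not the authors' implementation. ToyEventNet, the latency-reweighting rule, and all hyperparameters (lam, gamma, the candidate latencies) are illustrative assumptions, and the real target model would be a trained SNN rather than a small convolutional stand-in.

import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyEventNet(nn.Module):
    # Hypothetical stand-in classifier: consumes an event tensor of shape
    # (T, 2, H, W) with ON/OFF polarity channels and returns class logits.
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(2, 8, kernel_size=3, padding=1)
        self.fc = nn.Linear(8, num_classes)

    def forward(self, x):                      # x: (T, 2, H, W)
        h = F.relu(self.conv(x))               # time dim acts as the batch dim
        return self.fc(h.mean(dim=(0, 2, 3)))  # pool over time and space

def attack_raw_events(model, events, target, latencies=(4, 8, 16),
                      steps=100, lr=0.1, lam=1e-3, gamma=0.9):
    # events: binary tensor (T_max, 2, H, W); target: desired label (int).
    x = events.clone().float().requires_grad_(True)  # continuous relaxation
    opt = torch.optim.Adam([x], lr=lr)
    tgt = torch.tensor([target])
    w = torch.ones(len(latencies))                   # adaptive latency weights
    for _ in range(steps):
        # Stochastic latency sampling: draw a latency in proportion to its
        # current weight (the adaptation rule below is an assumption).
        idx = torch.multinomial(w, 1).item()
        T = latencies[idx]
        opt.zero_grad()
        logits = model(x[:T].clamp(0.0, 1.0)).unsqueeze(0)
        # Cross-entropy pulls the stream toward the target label; the L1
        # penalty keeps it sparse so it still resembles a raw event stream.
        loss = F.cross_entropy(logits, tgt) + lam * x.abs().mean()
        loss.backward()
        opt.step()
        # Reweight: decay latencies where the attack already succeeds,
        # boost those where it still fails.
        success = logits.argmax(dim=1).item() == target
        w[idx] *= gamma if success else 1.0 / gamma
    # Re-binarize: an event fires where the relaxed value crosses 0.5.
    return (x.detach().clamp(0.0, 1.0) > 0.5).float()

# Usage on random data:
model = ToyEventNet()
events = (torch.rand(16, 2, 32, 32) < 0.05).float()  # sparse event stream
adv = attack_raw_events(model, events, target=3)

The final thresholding step reflects the constraint the abstract emphasizes: the optimization runs in a continuous space, but the delivered adversarial sample must again be a discrete, sparse event stream, which is why the sparsity regularizer matters before re-binarization.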

Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.