JOURNAL ARTICLE

Compressive imaging for defending deep neural networks from adversarial attacks

Vladislav Kravets, Bahram Javidi, Adrian Stern

Year: 2021 · Journal: Optics Letters · Vol: 46 (8) · Pages: 1951-1951 · Publisher: Optica Publishing Group

Abstract

Despite their outstanding performance, convolutional deep neural networks (DNNs) are vulnerable to small adversarial perturbations. In this Letter, we introduce a novel approach to thwart adversarial attacks. We propose to employ compressive sensing (CS) to defend DNNs from adversarial attacks and, at the same time, to encode the image, thus preventing counterattacks. We present computer simulations and optical experimental results of object classification in adversarial images captured with a CS single-pixel camera.
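The defense summarized above can be illustrated with a minimal NumPy sketch of single-pixel compressive acquisition: the scene is measured through random patterns, and the image fed to the classifier is reconstructed from those few measurements, so perturbation components outside the measurement subspace are discarded. All dimensions, the pattern choice, and the pseudo-inverse reconstruction here are illustrative assumptions, not the authors' actual pipeline (which uses a physical single-pixel camera and sparsity-based recovery).

```python
import numpy as np

rng = np.random.default_rng(0)

n = 16 * 16  # flattened 16x16 scene (illustrative size)
m = 64       # number of single-pixel measurements, m << n

# Random +/-1 sensing patterns, standing in for the patterns
# displayed on the modulator of a single-pixel camera.
Phi = rng.choice([-1.0, 1.0], size=(m, n))

x = rng.random(n)                      # placeholder clean scene
delta = 0.01 * rng.standard_normal(n)  # small adversarial-style perturbation
x_adv = x + delta

# Compressive measurements of the perturbed scene.
y = Phi @ x_adv

# Minimum-norm reconstruction via the pseudo-inverse, a simple
# stand-in for the sparsity-based CS solvers used in practice.
x_rec = np.linalg.pinv(Phi) @ y

# x_rec lies in the row space of Phi: any part of the perturbation
# orthogonal to that m-dimensional subspace cannot reach the DNN.
print(x_rec.shape)
```

Because the sensing patterns also act as a measurement key, an attacker who cannot access Phi cannot easily craft perturbations that survive this acquisition step, which is the encoding/counterattack-prevention aspect mentioned in the abstract.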

Keywords:
Adversarial attacks; Compressive sensing; Deep neural networks; Convolutional neural networks; Single-pixel camera; Ghost imaging; Deep learning; Computer vision

Metrics

Cited By: 14
FWCI (Field-Weighted Citation Impact): 1.55
References: 23
Citation Normalized Percentile: 0.85

Topics

Adversarial Robustness in Machine Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Sparse and Compressive Sensing Techniques
Physical Sciences →  Engineering →  Computational Mechanics
Integrated Circuits and Semiconductor Failure Analysis
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
