JOURNAL ARTICLE

Physical Transferable Attack against Black-box Face Recognition Systems

Abstract

Recent studies have shown that machine learning models in general, and deep neural networks such as CNNs in particular, are vulnerable to adversarial attacks. In face recognition specifically, one can easily deceive deep networks by adding a visually imperceptible adversarial perturbation to the input images. However, most of these works assume an idealized scenario in which the attacker has perfect information about the victim model and the attack is performed in the digital domain, which is not a realistic assumption. As a result, these methods often transfer poorly (or not at all) to the real world. To address this issue, we propose a novel physical, transferable attack on deep face recognition systems that works in real-world settings without any knowledge of the victim model. Our experiments on state-of-the-art models with various architectures and training losses show non-trivial attack success rates. Given these results, we believe our method can enable further studies on improving the adversarial robustness and security of deep face recognition systems.
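The abstract's premise is that a visually imperceptible perturbation added to an input image can flip a recognizer's decision, and that transfer to an unseen victim is typically achieved by crafting the perturbation on surrogate models. As a point of reference only (the paper itself targets the harder black-box, physical setting), below is a minimal white-box digital sketch of that idea: a one-step signed-gradient (FGSM-style) update against a face-embedding model. The embed_model interface, the cosine-similarity objective, and the epsilon budget are illustrative assumptions, not the authors' method.

import torch
import torch.nn.functional as F

def fgsm_dodge(embed_model, image, enrolled_embedding, epsilon=4/255):
    """One-step evasion: push `image` away from an enrolled identity.

    Assumes `embed_model` maps a (B, 3, H, W) image batch in [0, 1] to
    a (B, D) embedding; both names are hypothetical placeholders.
    """
    image = image.clone().detach().requires_grad_(True)
    emb = F.normalize(embed_model(image), dim=-1)
    tgt = F.normalize(enrolled_embedding, dim=-1)
    # Recognizers typically match by cosine similarity, so minimizing it
    # drives the perturbed face away from the enrolled identity.
    loss = F.cosine_similarity(emb, tgt, dim=-1).mean()
    loss.backward()
    # Single signed-gradient step within an L-infinity budget of epsilon,
    # then clip back to the valid pixel range.
    adv = image - epsilon * image.grad.sign()
    return adv.clamp(0.0, 1.0).detach()

In the digital white-box case a step like this is often enough to lower the match score; the physical, black-box setting the paper addresses additionally requires the perturbation to survive printing, lighting, and camera capture, and to be crafted without gradients from the victim model.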

Keywords:
Computer science; Adversarial attack; Robustness; Facial recognition system; Deep learning; Artificial intelligence; Deep neural networks; Machine learning; Attack model; Artificial neural network; Transfer learning; Computer security; Pattern recognition

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 78
Citation Normalized Percentile: 0.15

Topics

Adversarial Robustness in Machine Learning
  Physical Sciences → Computer Science → Artificial Intelligence
Face Recognition and Analysis
  Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
  Physical Sciences → Computer Science → Artificial Intelligence