CONFERENCE PAPER

Meta-Attack: Class-agnostic and Model-agnostic Physical Adversarial Attack

Weiwei Feng, Baoyuan Wu, Tianzhu Zhang, Yong Zhang, Yongdong Zhang

Year: 2021   Conference: 2021 IEEE/CVF International Conference on Computer Vision (ICCV)   Pages: 7767-7776

Abstract

Modern deep neural networks are often vulnerable to adversarial examples. Most existing attack methods focus on crafting adversarial examples in the digital domain, while only a limited number of works study physical adversarial attacks. However, it is more challenging to generate effective adversarial examples in the physical world due to many uncontrollable physical dynamics. Most current physical attack methods aim to generate robust physical adversarial examples by simulating all possible physical dynamics. When attacking new images or new DNN models, they require expensive manual effort to simulate physical dynamics and considerable time to iteratively optimize for each image. To tackle these issues, we propose a class-agnostic and model-agnostic physical adversarial attack model (Meta-Attack), which can not only generate robust physical adversarial examples by simulating color and shape distortions, but also generalize to attacking novel images and novel DNN models after accessing only a few digital and physical images. To the best of our knowledge, this is the first work to formulate the physical attack as a few-shot learning problem. Here, the training task is redefined as the composition of a support set, a query set, and a target DNN model. Under this few-shot setting, we design a novel class-agnostic and model-agnostic meta-learning algorithm to enhance the generalization ability of our method. Extensive experimental results on two benchmark datasets, under four challenging experimental settings, verify the superior robustness and generalization of our method in comparison with state-of-the-art physical attack methods.
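The abstract frames physical attacks as few-shot learning over tasks built from a support set, a query set, and a target model. As a loose illustration of that training scheme only (not the paper's actual Meta-Attack method), the sketch below runs a first-order MAML-style loop on a toy linear regression problem: an inner-loop adaptation step on each task's support set, then an outer update evaluated on the query set. All names, the toy task, and the learning rates are assumptions for illustration.

```python
import random

random.seed(0)

def loss(theta, data):
    # mean squared error of a 1-parameter linear model (stand-in for an attack loss)
    return sum((theta * x - y) ** 2 for x, y in data) / len(data)

def grad(theta, data):
    # d/dtheta of the mean squared error above
    return sum(2 * (theta * x - y) * x for x, y in data) / len(data)

def meta_step(theta, tasks, inner_lr=0.05, outer_lr=0.02):
    # first-order MAML: adapt on support, accumulate outer gradient on query
    g = 0.0
    for support, query in tasks:
        adapted = theta - inner_lr * grad(theta, support)  # inner-loop step
        g += grad(adapted, query)                          # outer gradient (first-order)
    return theta - outer_lr * g / len(tasks)

def make_task(w=1.5):
    # toy task: noisy samples of y = w * x, split into support and query halves
    xs = [random.uniform(-1, 1) for _ in range(20)]
    data = [(x, w * x + random.gauss(0, 0.01)) for x in xs]
    return data[:10], data[10:]

theta = 0.0
tasks = [make_task() for _ in range(5)]
before = sum(loss(theta, q) for _, q in tasks) / len(tasks)
for _ in range(300):
    theta = meta_step(theta, tasks)
after = sum(loss(theta, q) for _, q in tasks) / len(tasks)
print(after < before)  # True: query loss drops after meta-training
```

The support/query split per task mirrors the abstract's redefinition of a training task; in the paper each task would additionally carry a target DNN model, which this toy sketch omits.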

Keywords:
Adversarial system, Computer science, Robustness, Artificial intelligence, Generalization, Set, Class, Deep learning, Machine learning, Cyber-physical system, Benchmark, Mathematics

Metrics

Cited By: 23
FWCI (Field Weighted Citation Impact): 1.84
References: 73
Citation Normalized Percentile: 0.89

Topics

Adversarial Robustness in Machine Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Anomaly Detection Techniques and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Malware Detection Techniques
Physical Sciences →  Computer Science →  Signal Processing

Related Documents

JOURNAL ARTICLE

DAMAD: Database, Attack, and Model Agnostic Adversarial Perturbation Detector

Akshay Agarwal, Gaurav Goswami, Mayank Vatsa, Richa Singh, Nalini Ratha

Journal: IEEE Transactions on Neural Networks and Learning Systems   Year: 2021   Vol: 33 (8)   Pages: 3277-3289
JOURNAL ARTICLE

Task and Model Agnostic Adversarial Attack on Graph Neural Networks

Kartik Sharma, Samidha Verma, Sourav Medya, Arnab Bhattacharya, Sayan Ranu

Journal: Proceedings of the AAAI Conference on Artificial Intelligence   Year: 2023   Vol: 37 (12)   Pages: 15091-15099
JOURNAL ARTICLE

Language Model Agnostic Gray-Box Adversarial Attack on Image Captioning

Nayyer Aafaq, Naveed Akhtar, Wei Liu, Mubarak Shah, Ajmal Mian

Journal: IEEE Transactions on Information Forensics and Security   Year: 2022   Vol: 18   Pages: 626-638