
LMBTough/FSPS


Improving Adversarial Transferability via Frequency-based Stationary Point Search

License: MIT · Venue: CIKM 2023

This GitHub repository contains the implementation code for the paper Improving Adversarial Transferability via Frequency-based Stationary Point Search.

Figure 1: Frequency-based attack visualization results for different models.

Abstract

Deep neural networks (DNNs) have been shown to be vulnerable to interference from adversarial samples, leading to erroneous predictions. Investigating adversarial attacks can effectively improve the reliability as well as the performance of deep neural models in real-world applications. Since it is generally challenging to infer the parameters in black-box models, high transferability becomes an important factor for the success rate of an attack method. Recently, the Spectrum Simulation Attack (SSA) method has exhibited promising results based on the frequency domain. In light of SSA, we propose a novel attack approach in this paper, which achieves the best results among diverse state-of-the-art transferable adversarial attack methods. Our method aims to find a stationary point, which extends the ability to find multiple local optima with the optimal local attack effect. After finding the stationary point, a frequency-based search is employed to explore the best adversarial samples in the neighbouring space, ultimately determining the final adversarial direction. We compare our method against a variety of cutting-edge transferable adversarial methods. Extensive experiments validate that our method improves the attack success rate by 4.7% for conventionally trained models and 53.1% for adversarially trained models.
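The frequency-based search described above can be illustrated with a minimal sketch in the spirit of SSA-style spectrum transforms: move an image into the frequency domain with a DCT, randomly scale the spectrum, and invert back to obtain a perturbed sample. This is an illustrative toy, not the paper's implementation; the function name `spectrum_perturb` and the parameter `rho` are our own assumptions.

```python
import numpy as np
from scipy.fft import dctn, idctn

def spectrum_perturb(image: np.ndarray, rho: float = 0.5, seed: int = 0) -> np.ndarray:
    """Illustrative SSA-style spectrum transform: DCT -> random spectral scaling -> inverse DCT."""
    rng = np.random.default_rng(seed)
    spectrum = dctn(image, norm="ortho")                        # move to the frequency domain
    mask = rng.uniform(1 - rho, 1 + rho, size=spectrum.shape)   # random per-coefficient scaling
    return idctn(spectrum * mask, norm="ortho")                 # back to the pixel domain

img = np.random.rand(3, 32, 32)
out = spectrum_perturb(img)
print(out.shape)  # (3, 32, 32)
```

With `rho = 0` the mask is all ones and the transform reduces to the identity (up to floating-point error), which is a quick sanity check on the DCT round trip.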

Installation and Setup

Requirements

  • python 3.8.10
  • pytorch 1.13.1
  • torchvision 0.14.1
  • pretrainedmodels 0.7.0
  • numpy 1.21.3
  • tqdm 4.63.1

Setup

The implementation requires several pretrained models, which are used to evaluate adversarial robustness and transferability. Download the following models and place them in the models directory:

| Model | Download Link |
| --- | --- |
| Inception V3 | tf2torch_inception_v3 |
| Inception V4 | tf2torch_inception_v4 |
| Inception-ResNet-v2 | tf2torch_resnet_v2_152 |
| ResNet V2 152 | tf2torch_resnet_v2_152 |
| Inception v3 adv | tf2torch_adv_inception_v3 |
| Inception ResNet v2 adv | adv_inception_resnet_v2_2017_12_18.tar.gz |
| Inception v3 adv ens3 | tf2torch_ens3_adv_inc_v3 |
| Inception v3 adv ens4 | tf2torch_ens4_adv_inc_v3 |
| Inception ResNet v2 adv ens3 | tf2torch_ens_adv_inc_res_v2 |

The models in the table above are from here.

Execution Guidelines

To run the FSPS attack and assess its efficacy in generating transferable adversarial examples, use the following commands:

  • FSPS

```shell
CUDA_VISIBLE_DEVICES=0 python attack-FSPS.py --output_dir outputs_temp --method TI --num_images 1000 --model inceptionv3
CUDA_VISIBLE_DEVICES=0 python attack-FSPS.py --output_dir outputs_temp --method TI --num_images 1000 --model inceptionv4
CUDA_VISIBLE_DEVICES=0 python attack-FSPS.py --output_dir outputs_temp --method TI --num_images 1000 --model inceptionresnetv2
CUDA_VISIBLE_DEVICES=0 python attack-FSPS.py --output_dir outputs_temp --method TI --num_images 1000 --model resnet152
```

  • SSA

```shell
CUDA_VISIBLE_DEVICES=0 python attack-SSA.py --output_dir outputs_temp --method DITIMI --num_images 1000 --model inceptionv3
```

  • Baseline

```shell
CUDA_VISIBLE_DEVICES=0 python attack-baseline.py --output_dir outputs_temp --method DI --num_images 1000 --model inceptionv3
```

  • Verify

```shell
CUDA_VISIBLE_DEVICES=0 python verify.py --method baseline_result_DI-v3 --output_dir outputs_temp/ --num_images 1000 --output_csv result.csv
```
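Running FSPS against each surrogate model only changes the `--model` flag, so the per-model commands can be generated programmatically. This small helper is an illustrative convenience, not part of the repository:

```python
# Print the FSPS attack command for each of the four surrogate models.
BASE = ("CUDA_VISIBLE_DEVICES=0 python attack-FSPS.py "
        "--output_dir outputs_temp --method TI --num_images 1000 --model {model}")
MODELS = ["inceptionv3", "inceptionv4", "inceptionresnetv2", "resnet152"]

commands = [BASE.format(model=m) for m in MODELS]
for cmd in commands:
    print(cmd)
```

Piping the printed lines into a shell (or `subprocess.run`) would execute them in sequence on a single GPU.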

Citing FSPS

If you utilize this implementation or the FSPS methodology in your research, please cite the following paper:

@inproceedings{zhu2023improving,
  title={Improving adversarial transferability via frequency-based stationary point search},
  author={Zhu, Zhiyu and Chen, Huaming and Zhang, Jiayu and Wang, Xinyi and Jin, Zhibo and Lu, Qinghua and Shen, Jun and Choo, Kim-Kwang Raymond},
  booktitle={Proceedings of the 32nd ACM International Conference on Information and Knowledge Management},
  pages={3626--3635},
  year={2023}
}

Acknowledgments

We extend our gratitude to the contributors and researchers whose insights and efforts have been instrumental in the development of the FSPS methodology.

For further information or inquiries, please refer to the corresponding author(s) of the FSPS paper or initiate a discussion in this repository's Issues section.

Reference

Code adapted from: SSA
