Procedural Adversarial Examples

This repository contains sample code and interactive Jupyter notebooks for the paper "Procedural Noise Adversarial Examples for Black-Box Attacks on Deep Neural Networks".

Procedural noise functions are used to generate textures in computer graphics. In this work, we use several types of procedural noise to generate adversarial perturbations against popular deep neural network architectures trained on the ImageNet image classification task.

The results show that adversarial examples can be generated using procedural noise without any knowledge of the target classifier. This highlights the lack of robustness of current deep neural networks to procedural noise patterns.
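
To make the idea concrete, here is a minimal sketch (not the code from this repository) of how a Gabor-noise perturbation can be built with NumPy and OpenCV: a sparse field of random impulses is convolved with a Gabor kernel, the result is rescaled to a small L-infinity budget, and it is added to an image. The image x, the kernel size, and all parameter values below are placeholder assumptions.

# Minimal sketch of a Gabor-noise adversarial perturbation (illustration only;
# the notebooks in this repository implement the full attack).
import numpy as np
import cv2

def gabor_noise_perturbation(size, n_impulses=200, sigma=8.0, theta=0.8,
                             lambd=16.0, eps=8.0 / 255.0, seed=0):
    """Sparse convolution of random impulses with a Gabor kernel,
    rescaled to an L-infinity budget eps (values are placeholders)."""
    rng = np.random.RandomState(seed)
    impulses = np.zeros((size, size), dtype=np.float32)
    ys = rng.randint(0, size, n_impulses)
    xs = rng.randint(0, size, n_impulses)
    impulses[ys, xs] = rng.choice([-1.0, 1.0], n_impulses).astype(np.float32)
    # Anisotropic Gabor kernel from OpenCV: kernel size, sigma, orientation,
    # wavelength, aspect ratio, phase offset.
    kernel = cv2.getGaborKernel((23, 23), sigma, theta, lambd, 1.0, 0.0)
    noise = cv2.filter2D(impulses, -1, kernel.astype(np.float32))
    # Rescale so the perturbation stays within [-eps, eps] per pixel.
    noise = eps * noise / (np.abs(noise).max() + 1e-12)
    return noise[..., None]  # broadcast over the colour channels

# Example: perturb a stand-in 224x224 RGB image in [0, 1] and clip.
x = np.random.rand(224, 224, 3).astype(np.float32)
x_adv = np.clip(x + gabor_noise_perturbation(224), 0.0, 1.0)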

We encourage you to explore our Python notebooks and make your own adversarial examples! For the best experience, we recommend using a GPU:

  1. intro_gabor: A brief introduction to Gabor noise.

  2. slider_gabor, slider_perlin: Visualize and interactively play with the noise parameters. See the other slider notebooks for different procedural noise functions.

  3. intro_bopt: See how Bayesian optimization can automatically find noise parameters that fool a classifier on an image (a rough sketch of this idea follows the list).
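
The sketch below illustrates that black-box search in spirit only. It uses scikit-optimize's gp_minimize as a generic Bayesian optimizer (the notebook may use a different library), and it assumes a Keras-style ImageNet classifier model, an input image x in [0, 1], its true label index true_label, and the gabor_noise_perturbation helper from the earlier sketch; none of these names come from the repository.

# Hedged sketch of black-box parameter search with Bayesian optimization.
# model, x, true_label and gabor_noise_perturbation are assumed to exist
# (e.g. a Keras ImageNet classifier and the earlier sketch).
import numpy as np
from skopt import gp_minimize

def objective(params):
    sigma, theta, lambd = params
    noise = gabor_noise_perturbation(224, sigma=sigma, theta=theta, lambd=lambd)
    x_adv = np.clip(x + noise, 0.0, 1.0)
    probs = model.predict(x_adv[None])[0]
    # Minimizing the confidence assigned to the true label pushes the
    # classifier towards a misclassification, using only its outputs.
    return float(probs[true_label])

search_space = [(1.0, 30.0),    # sigma: kernel width (assumed range)
                (0.0, np.pi),   # theta: orientation
                (1.0, 32.0)]    # lambd: wavelength
result = gp_minimize(objective, search_space, n_calls=50, random_state=0)
best_sigma, best_theta, best_lambd = result.x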

See our paper for more details: "Procedural Noise Adversarial Examples for Black-Box Attacks on Deep Neural Networks." Kenneth T. Co, Luis Muñoz-González, Emil C. Lupu. arXiv 2019.

Python Dependencies

Acknowledgments

Learn more about the Resilient Information Systems Security (RISS) group at Imperial College London. The main author is a PhD student supported by DataSpartan. DataSpartan is not affiliated with the university.

Please cite this paper if you use the code in this repository as part of a published research project.

@article{co2019procedural,
  title={Procedural Noise Adversarial Examples for Black-Box Attacks on Deep Neural Networks},
  author={Co, Kenneth T and Mu{\~n}oz-Gonz{\'a}lez, Luis and Lupu, Emil C},
  journal={arXiv preprint arXiv:1810.00470},
  year={2019}
}

This project is licensed under the MIT License; see the LICENSE.md file for details.