
Diffusion Denoised Smoothing

This is a PyTorch implementation of Diffusion Denoised Smoothing, from the following paper:

(Certified!!) Adversarial Robustness for Free!. ICLR 2023.
Nicholas Carlini*, Florian Tramèr*, Krishnamurthy Dvijotham, Leslie Rice, Mingjie Sun, Zico Kolter

This repository is based on locuslab/smoothing, openai/improved-diffusion and openai/guided-diffusion.


We show how to achieve state-of-the-art certified adversarial robustness to 2-norm bounded perturbations by relying exclusively on off-the-shelf pretrained models.
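
In a nutshell, the pipeline is: add isotropic Gaussian noise at level sigma to the input, apply the pretrained diffusion model once as a denoiser at the matching noise level, classify the denoised image with an off-the-shelf classifier, and take a majority vote over many noise draws. The snippet below is only a hedged sketch of that loop; denoiser and classifier are hypothetical stand-ins for the diffusion model and pretrained classifier, not this repository's actual API.

import torch

def denoised_smoothing_predict(x, denoiser, classifier, sigma, num_samples=100):
    # Hedged sketch of diffusion denoised smoothing (not this repo's exact code).
    # x          : a single input image, shape (C, H, W)
    # denoiser   : callable(noisy_batch, sigma) -> denoised_batch; a hypothetical
    #              wrapper around one-shot denoising with a pretrained diffusion model
    # classifier : callable(batch) -> logits; an off-the-shelf pretrained classifier
    # sigma      : standard deviation of the Gaussian smoothing noise
    counts = None
    with torch.no_grad():
        for _ in range(num_samples):
            noisy = x.unsqueeze(0) + sigma * torch.randn_like(x).unsqueeze(0)  # x + N(0, sigma^2 I)
            denoised = denoiser(noisy, sigma)   # one-shot denoising via the diffusion model
            logits = classifier(denoised)       # classify the denoised image
            if counts is None:
                counts = torch.zeros(logits.shape[-1], dtype=torch.long)
            counts[logits.argmax(dim=-1).item()] += 1
    return counts.argmax().item()               # majority vote = smoothed prediction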

Setup

Create a new conda virtual environment:

conda create -n diffusion_smoothing python=3.8 -y
conda activate diffusion_smoothing

Install PyTorch and torchvision following the official instructions. For example:

conda install pytorch==1.12.0 torchvision==0.13.0 cudatoolkit=11.3 -c pytorch

Clone this repo and install the dependencies:

git clone https://github.com/ethz-privsec/diffusion_denoised_smoothing.git
pip install timm transformers statsmodels

We use the following class-unconditional diffusion models:
CIFAR-10: Unconditional CIFAR-10 with L_hybrid objective (from openai/improved-diffusion).
ImageNet: Unconditional 256x256 diffusion (from openai/guided-diffusion).
Remember to download these model checkpoints into the corresponding directories.
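
The exact filenames and directory layout are whatever certify.py expects; the paths in the short check below are placeholder assumptions, not prescribed by this repository, so adjust them to wherever you actually saved the downloaded checkpoints.

# Hedged sanity check: the checkpoint paths below are placeholder assumptions;
# point them at wherever you saved the downloaded diffusion checkpoints.
from pathlib import Path

checkpoints = {
    "CIFAR-10 diffusion (L_hybrid)": Path("cifar10/cifar10_uncond_50M_500K.pt"),
    "ImageNet 256x256 diffusion":    Path("imagenet/256x256_diffusion_uncond.pt"),
}
for name, path in checkpoints.items():
    print(f"{name}: {path} [{'found' if path.is_file() else 'MISSING'}]")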

Evaluation

We give example evaluation commands for running certification on CIFAR-10 and ImageNet:

# CIFAR-10
python cifar10/certify.py \
--sigma 1.00 --skip 1 --N0 100 --N 100000 --batch_size 200 \
--outfile [file to store certification results]
# ImageNet
python imagenet/certify.py \
--sigma 1.00 --skip 50 --N0 100 --N 10000 --batch_size 200 \
--outfile [file to store certification results]

License

This project is released under the MIT license. Please see the LICENSE file for more information.

Citation

If you find this repository helpful, please consider citing:

@Article{carlini2023free,
  author  = {Nicholas Carlini and Florian Tramèr and Krishnamurthy Dvijotham and Leslie Rice and Mingjie Sun and Zico Kolter},
  title   = {(Certified!!) Adversarial Robustness for Free!},
  journal = {International Conference on Learning Representations (ICLR)},
  year    = {2023},
}
