PANDA

Official PyTorch implementation of "PANDA: Adapting Pretrained Features for Anomaly Detection and Segmentation" (CVPR 2021). Paper: https://arxiv.org/pdf/2010.05903.pdf
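PANDA adapts pretrained features to the normal training data and then scores test samples by their distance to the normal feature set. A minimal kNN-scoring sketch in that spirit (not the repository's exact code; `knn_anomaly_scores` and `k=2` are illustrative choices, and features are assumed to be pre-extracted):

```python
import numpy as np

def knn_anomaly_scores(train_feats, test_feats, k=2):
    """Score each test sample by its mean distance to the k nearest
    normal training features; larger scores are more anomalous."""
    # Pairwise Euclidean distances, shape (n_test, n_train).
    d = np.linalg.norm(test_feats[:, None, :] - train_feats[None, :, :], axis=-1)
    # Mean of the k smallest distances per test sample.
    return np.sort(d, axis=1)[:, :k].mean(axis=1)
```

A test sample far from every normal training feature receives a high score, so thresholding the score separates normal from anomalous samples.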

Virtual Environment

Use the following commands:

```shell
cd path-to-PANDA-directory
virtualenv venv --python python3
source venv/bin/activate
pip install -r requirements.txt --find-links https://download.pytorch.org/whl/torch_stable.html
```

Data Preparation

Use the following commands:

```shell
cd path-to-PANDA-directory
mkdir data
```

Download:

Extract these files into path-to-PANDA-directory/data and unzip tiny.zip

Experiments

To replicate the results on CIFAR-10 or FMNIST for a specific normal class with EWC:

```shell
python panda.py --dataset=cifar10 --label=n --ewc --epochs=50
python panda.py --dataset=fashion --label=n --ewc --epochs=50
```
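The `--ewc` flag uses elastic weight consolidation to keep the adapted features from drifting too far from the pretrained ones during fine-tuning. A generic sketch of an EWC penalty, a Fisher-weighted squared distance to the pretrained weights (`ewc_penalty`, `fisher`, and `lam` are illustrative names, not the repository's API):

```python
import torch

def ewc_penalty(model, pretrained_params, fisher, lam=1.0):
    """Generic EWC regularizer added to the training loss: penalize
    movement away from the pretrained parameters, weighted
    per-parameter by the (diagonal) Fisher information."""
    penalty = torch.zeros(())
    for name, p in model.named_parameters():
        penalty = penalty + (fisher[name] * (p - pretrained_params[name]) ** 2).sum()
    return lam * penalty
```

The total training loss would then be the adaptation objective plus this term, trading plasticity against forgetting via `lam`.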

To replicate the results on CIFAR-10 or FMNIST for a specific normal class with early stopping:

```shell
python panda.py --dataset=cifar10 --label=n
python panda.py --dataset=fashion --label=n
```

Where n indicates the id of the normal class.

To run experiments on other datasets, set the path in utils.py to the desired dataset.

OE Experiments

To replicate the results on CIFAR-10 for a specific normal class:

```shell
python outlier_exposure.py --dataset=cifar10 --label=n
```

Where n indicates the id of the normal class.
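Outlier exposure trains against an auxiliary dataset of known outliers in addition to the normal data. A generic sketch of the standard OE term, which pushes predictions on the auxiliary outliers toward the uniform distribution (this is the common formulation, not necessarily the repository's exact loss):

```python
import math
import torch
import torch.nn.functional as F

def oe_loss(outlier_logits):
    """Generic outlier-exposure term: mean negative log-softmax of the
    outlier predictions over all classes, i.e. cross-entropy against
    the uniform distribution up to a constant."""
    return -F.log_softmax(outlier_logits, dim=1).mean()
```

For perfectly uniform predictions over K classes (e.g. all-zero logits) the term equals log K, its minimum; any confident prediction on an outlier raises it.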

Further work

See our new paper “Mean-Shifted Contrastive Loss for Anomaly Detection” which achieves state-of-the-art anomaly detection performance on multiple benchmarks including 97.5% ROC-AUC on the CIFAR-10 dataset.

GitHub Repository
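The mean-shifted loss measures angles between features relative to the center of the normalized feature set rather than the origin. A minimal sketch of that mean-shift transformation (assuming `feats` are already L2-normalized; `mean_shift` is an illustrative name, not the repository's API):

```python
import numpy as np

def mean_shift(feats):
    """Re-center L2-normalized features on their mean and re-normalize,
    so subsequent angular comparisons are made around the set's center
    rather than around the origin."""
    shifted = feats - feats.mean(axis=0)
    return shifted / np.linalg.norm(shifted, axis=1, keepdims=True)
```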

Video Anomaly Detection

See our new paper “Attribute-based Representations for Accurate and Interpretable Video Anomaly Detection” which achieves state-of-the-art video anomaly detection performance on multiple benchmarks including 85.9% ROC-AUC on the ShanghaiTech dataset.

GitHub Repository

Citation

If you find this useful, please cite our paper:

```bibtex
@inproceedings{reiss2021panda,
  title={PANDA: Adapting Pretrained Features for Anomaly Detection and Segmentation},
  author={Reiss, Tal and Cohen, Niv and Bergman, Liron and Hoshen, Yedid},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={2806--2814},
  year={2021}
}
```
