
josaklil-ai / aug_active_learning


This project is a fork of aminparvaneh/alpha_mix_active_learning.


Implementation of Active Learning by Feature Mixing (ALFA-Mix) paper + Augmentation method



Active Learning by Feature Mixing (ALFA-Mix)

PyTorch implementation of ALFA-Mix. For details, see the paper Active Learning by Feature Mixing, accepted at CVPR 2022.

[Figure: ALFA-Mix overview]

The code includes implementations of all the baselines presented in the paper. Parts of the code are borrowed from https://github.com/JordanAsh/badge.

Setup

The dependencies are listed in requirements.txt. Python 3.8.3 is recommended for setting up the environment.

Datasets

The code supports torchvision built-in implementations of MNIST, EMNIST, SVHN, CIFAR10 and CIFAR100. Additionally, it supports MiniImageNet, DomainNet-Real (and two subsets of that) and OpenML datasets.

Training

To run ALFA-Mix in a single setting, use the following script, which by default uses 5 different initial random seeds:

python main.py \
        --data_name MNIST --data_dir your_data_directory --log_dir your_log_directory \
        --n_init_lb 100 --n_query 100 --n_round 10 --learning_rate 0.001 --n_epoch 1000 --model mlp \
        --strategy AlphaMixSampling --alpha_opt

To run the closed-form variant of ALFA-Mix, set --alpha_closed_form_approx --alpha_cap 0.2.
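For intuition, the feature-mixing idea at the heart of ALFA-Mix interpolates an unlabeled point's feature toward a labeled anchor; candidates whose prediction flips under this small mix are deemed informative and queried. A minimal sketch with toy values (not the repository's implementation):

```python
def mix_features(z_unlabeled, z_anchor, alpha):
    """Convex mix of an unlabeled feature with a labeled anchor feature."""
    return [alpha * a + (1 - alpha) * u for u, a in zip(z_unlabeled, z_anchor)]

# Toy features: an unlabeled point and a class anchor (e.g. a mean labeled feature).
z_u = [0.0, 0.0, 0.0, 0.0]
z_star = [1.0, 1.0, 1.0, 1.0]

z_mix = mix_features(z_u, z_star, 0.2)  # alpha bounded by --alpha_cap 0.2
# A candidate is queried if the classifier's prediction on z_mix differs
# from its prediction on z_u.
```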

Evaluation

To evaluate all the experiments and produce the final comparison matrix, gather the results in a folder with the structure Overall/Dataset/Setting (e.g. Overall/MNIST/MLP_small_budget). The script below accumulates the results across settings for each dataset and generates the overall comparison matrix:

python agg_results.py --directory Path/Overall --dir_type general 

Pass dataset or setting to --dir_type to evaluate the results at the dataset or setting level, respectively.
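Conceptually, the aggregation walks the Overall/Dataset/Setting tree, averages each strategy's accuracy over seeds, and tabulates the averages per dataset. A toy sketch with hypothetical numbers (not the actual agg_results.py logic):

```python
from collections import defaultdict

# Hypothetical per-seed accuracies, keyed by (dataset, setting, strategy).
results = {
    ("MNIST", "MLP_small_budget", "AlphaMixSampling"): [0.91, 0.93],
    ("MNIST", "MLP_small_budget", "RandomSampling"): [0.85, 0.86],
    ("CIFAR10", "ResNet_small_budget", "AlphaMixSampling"): [0.62, 0.64],
    ("CIFAR10", "ResNet_small_budget", "RandomSampling"): [0.58, 0.57],
}

def comparison_matrix(results):
    """Average accuracy over seeds, tabulated as dataset -> strategy -> mean."""
    matrix = defaultdict(dict)
    for (dataset, setting, strategy), accs in results.items():
        matrix[dataset][strategy] = sum(accs) / len(accs)
    return dict(matrix)
```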

Citing

@inproceedings{parvaneh2022active,
  title={Active Learning by Feature Mixing},
  author={Parvaneh, Amin and Abbasnejad, Ehsan and Teney, Damien and Haffari, Gholamreza Reza and van den Hengel, Anton and Shi, Javen Qinfeng},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  pages={12237--12246},
  year={2022}
}

Contributors

aminparvaneh, josaklil-ai
