
This project is a fork of moskomule/dda.



License: MIT License



Differentiable Data Augmentation Library

This library is the core of Faster AutoAugment and its descendants. It is research oriented, and its API may change in the near future.

Requirements and Installation

Requirements

Python>=3.8
PyTorch>=1.5.0
torchvision>=0.6
kornia>=0.2

Installation

pip install -U git+https://github.com/moskomule/dda

APIs

dda.functional

Basic operations that are differentiable w.r.t. the magnitude parameter mag. When mag=0, no augmentation is applied; when mag=1 (or mag=-1, if it exists), the strongest augmentation is applied. As introduced in Faster AutoAugment, some operations use a straight-through estimator to be differentiable w.r.t. their magnitude parameters.

def operation(img: torch.Tensor,
              mag: Optional[torch.Tensor]) -> torch.Tensor:
    ...

dda.pil contains similar APIs built on PIL (not differentiable).
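The straight-through estimator mentioned above lets a non-differentiable step (e.g. rounding in posterization) pass gradients back to mag. A minimal sketch of the trick in plain PyTorch (ste_round is a hypothetical name, not dda's actual code):

```python
import torch

def ste_round(x: torch.Tensor) -> torch.Tensor:
    # Forward pass uses the non-differentiable round();
    # backward pass behaves as the identity, so gradients flow to x.
    return x + (torch.round(x) - x).detach()

mag = torch.tensor(0.7, requires_grad=True)
out = ste_round(mag)                 # forward: round(0.7) = 1.0
out.backward()
print(out.item(), mag.grad.item())   # 1.0 1.0
```

The detached difference contributes the hard value in the forward pass while leaving the gradient of the identity in the backward pass.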

dda.operations

class Operation(nn.Module):
   
    def __init__(self,
                 initial_magnitude: Optional[float] = None,
                 initial_probability: float = 0.5,
                 magnitude_range: Optional[Tuple[float, float]] = None,
                 probability_range: Optional[Tuple[float, float]] = None,
                 temperature: float = 0.1,
                 flip_magnitude: bool = False,
                 magnitude_scale: float = 1,
                 debug: bool = False):
        ...

If magnitude_range=None (or probability_range=None), then magnitude (or probability) is a Buffer rather than a Parameter, i.e., it is fixed and not learnable.
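The Parameter-vs-Buffer distinction determines what an optimizer will update. A minimal sketch of the same pattern (ToyOperation is a hypothetical class, not dda's implementation):

```python
from typing import Optional, Tuple

import torch
import torch.nn as nn

class ToyOperation(nn.Module):
    def __init__(self, initial_magnitude: float = 0.5,
                 magnitude_range: Optional[Tuple[float, float]] = None):
        super().__init__()
        mag = torch.tensor(initial_magnitude)
        if magnitude_range is None:
            # Fixed magnitude: registered as a buffer, ignored by optimizers.
            self.register_buffer("magnitude", mag)
        else:
            # Learnable magnitude: registered as a parameter, updated by optimizers.
            self.magnitude = nn.Parameter(mag)

fixed = ToyOperation(magnitude_range=None)
learned = ToyOperation(magnitude_range=(0.0, 1.0))
print(len(list(fixed.parameters())), len(list(learned.parameters())))  # 0 1
```

Both variants still appear in state_dict, so a fixed magnitude is saved and loaded with the model even though it receives no gradient updates.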

magnitude moves in magnitude_scale * magnitude_range. For example, dda.operations.Rotation has magnitude_range=[0, 1] and magnitude_scale=30, so its magnitude ranges from 0 to 30 degrees.

To differentiate w.r.t. the probability parameter, RelaxedBernoulli is used.
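Ordinary Bernoulli sampling yields hard 0/1 draws with no gradient w.r.t. the probability; RelaxedBernoulli (the Concrete distribution) instead produces a temperature-controlled continuous sample via the reparameterization trick. A sketch using PyTorch's built-in distribution (the temperature and seed here are arbitrary choices for the example, not dda's defaults):

```python
import torch
from torch.distributions import RelaxedBernoulli

torch.manual_seed(0)
prob = torch.tensor(0.5, requires_grad=True)
dist = RelaxedBernoulli(temperature=torch.tensor(1.0), probs=prob)

# rsample() draws a relaxed (continuous) sample in (0, 1) that is
# differentiable w.r.t. prob, so the probability can be learned.
sample = dist.rsample()
sample.backward()
print(sample.item(), prob.grad)
```

Lower temperatures push samples toward hard 0/1 values at the cost of noisier gradients, which is why temperature is exposed as a tunable argument of Operation.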

Examples

Citation

dda (except RandAugment) was developed as the core library of the following research projects.

If you use dda in your academic research, please cite hataya2020a.

@inproceedings{hataya2020a,
    title={{Faster AutoAugment: Learning Augmentation Strategies using Backpropagation}},
    author={Ryuichiro Hataya and Jan Zdenek and Kazuki Yoshizoe and Hideki Nakayama},
    year={2020},
    booktitle={ECCV}
}

...

