
Autodiff



AutoDiff is a lightweight, transparent reverse-mode automatic differentiation (a.k.a. backpropagation) library written in Python with NumPy vectorization. AutoDiff works by breaking larger user-defined functions down into primitive operators (such as addition, multiplication, etc.) whose derivatives are pre-defined. During the forward pass, AutoDiff dynamically builds a computation graph of the larger user-defined function, using these primitive operators as the nodes of the graph. During the backward pass over that graph, the chain rule is applied to compute the derivative of the output with respect to each variable. Although there are various techniques for implementing reverse-mode automatic differentiation, AutoDiff uses Python's operator overloading capabilities, which is by far the simplest and most intuitive approach. AutoDiff is currently split into 2 levels:

Level 1 (L1): Introduces the base functionality of AutoDiff. Level 1 defines the Tensor class, supports the primitive operators, and includes a decorator that lets users register custom "primitive" ops on top of AutoDiff.

Level 2 (L2): Everything inside the nn folder. Level 2 builds on top of Level 1 using the "register" decorator and introduces various loss functions, activation functions, and more.

Dependencies: NumPy.
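
To make the mechanism described above concrete, here is a minimal, self-contained sketch of a reverse-mode node (illustrative only, not AutoDiff's actual Tensor class): each overloaded operator records its parents and local derivatives during the forward pass, and backward() then walks the graph applying the chain rule.

import numpy as np

class Var:
    # Toy reverse-mode node; AutoDiff's real Tensor supports many more ops.
    def __init__(self, value, parents=()):
        self.value = np.asarray(value, dtype=float)
        self.grad = 0.0
        self.parents = parents  # list of (parent, local_derivative) pairs

    def __add__(self, other):
        return Var(self.value + other.value, parents=[(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   parents=[(self, other.value), (other, self.value)])

    def backward(self, g=1.0):
        # Chain rule: each parent's gradient is the incoming gradient
        # times the local derivative recorded during the forward pass.
        self.grad = self.grad + g
        for parent, local in self.parents:
            parent.backward(g * local)

a, b = Var(2.0), Var(3.0)
x = (a + b) * a        # forward pass builds the graph
x.backward()           # backward pass applies the chain rule
print(a.grad, b.grad)  # 7.0 2.0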


Table of Contents

  - Currently Supported NN Features
  - Installation
  - Usage
  - User Defined Primitives
  - License


Currently Supported NN Features:

Activation Functions: ReLU, Leaky ReLU, PReLU, Sigmoid, Log Sigmoid, Softmax, Log Softmax, Tanh

Loss Functions: MAE (L1), MSE (L2), Binary CE, Categorical CE, Sigmoid & Softmax w/ CE, Hinge Loss

Optimizers: SGD, SGD w/ Momentum, Nesterov Momentum, AdaGrad, RMSProp, Adam, AdaDelta

Layers: Linear, Sequential, 2D Convolution, Max Pooling, Batch Norm, Dropout

Installation

For the moment there is no packaged release; clone the repository to play with the library.


Usage

Basic Example:

from autodiff.tensor import Tensor

a = Tensor(2)
b = Tensor(3)

# This is the same as writing f(a, b) = ((a + b) * a)^2
# We just break it down into primitive ops.

z = a + b
y = z * a
x = y ** 2

# This is where the magic happens
x.backward()

print("value of x: ({})".format(x.value))
print("grad of x wrt a: ({})".format(a.grad))
print("grad of x wrt b: ({})".format(b.grad))

NN Example:

Building an MLP for multiclass softmax classification is as simple as this. The full example can be found here: https://github.com/ryanirl/autodiff/blob/main/examples/spiral_classification.py

from autodiff.tensor import Tensor
import autodiff.nn as nn

# Instantiating the Model
model = nn.Sequential(
    nn.Linear(2, 100),
    nn.ReLU(),
    nn.Linear(100, 3),
    nn.Softmax()
)

# Defining the Loss
loss_fun = nn.CategoricalCrossEntropy()

# Defining the Optimizer
optimizer = nn.Adam(model.parameters())

# Training (X holds the input features, y the target labels)
for i in range(1000):
    optimizer.zero_grad()

    out = model.forward(X)

    loss = loss_fun(out, y)

    loss.backward()

    optimizer.step()
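
The snippet above assumes X and y already hold the training data; in the linked example they come from a three-class spiral dataset. A rough stand-in for that setup (the helper below is an assumption for illustration, not the repository's code; see the linked script for the real data pipeline) might look like:

import numpy as np

# Hypothetical spiral dataset, only to make the training snippet self-contained.
def make_spiral(points_per_class=100, n_classes=3):
    X = np.zeros((points_per_class * n_classes, 2))
    y = np.zeros(points_per_class * n_classes, dtype=int)
    for c in range(n_classes):
        ix = np.arange(points_per_class * c, points_per_class * (c + 1))
        r = np.linspace(0.0, 1.0, points_per_class)                # radius
        t = np.linspace(c * 4.0, (c + 1) * 4.0, points_per_class)  # angle
        t = t + np.random.randn(points_per_class) * 0.2            # add a little noise
        X[ix] = np.c_[r * np.sin(t), r * np.cos(t)]
        y[ix] = c
    return X, y

X, y = make_spiral()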

Plotting the decision boundary gives the figure shown in the original repository.


User Defined Primitives

You can even define your own primitive functions. A user-defined tanh primitive, for example, might look like this:

from autodiff.tensor import Tensor, register
import numpy as np

# For simplicity
def e(x): return np.exp(x.value)

@register
class tanh:
    def forward(x):
        # Forward value: tanh(x) written in terms of exponentials of x.value.
        return (e(x) - e(-x)) / (e(x) + e(-x))

    def backward(g, x, z):
        # g is the incoming gradient and z the forward output, so this is
        # the chain rule applied with d/dx tanh(x) = 1 - tanh(x)^2.
        return [g * (1.0 - (z.value ** 2))]


x = Tensor([1, 2, 3])
y = x.tanh()

y.backward()

print("The gradient of y wrt x: {}".format(x.grad))

# OUTPUTS: The gradient of y wrt x: [[0.41997434 0.07065082 0.00986604]]
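
As a quick sanity check, the printed values agree with the analytic derivative 1 - tanh(x)^2 computed directly with NumPy:

import numpy as np

x = np.array([1, 2, 3])
print(1.0 - np.tanh(x) ** 2)

# OUTPUTS: [0.41997434 0.07065082 0.00986604]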

License

Distributed under the MIT License. See LICENSE for more information.
