
DeepLUT: an End-to-End Library for Training Lookup Tables

Welcome to DeepLUT

Contents:

  • What is DeepLut?
  • DeepLut Architecture

What is DeepLut?

A Python library that aims to provide a flexible, extensible, lightning-fast, and easy-to-use framework for training look-up table (LUT) deep neural networks from scratch. DeepLut focuses on helping researchers with:

  • Rapid prototyping.
  • Reproducing and comparing results obtained by other methods.
  • Consistent integration with hardware synthesis tools.
  • Extending the framework and innovating new ideas.

DeepLut Architecture

DeepLut is organized into the following modules:

  • Initializers: These modules implement weight initialization schemes. The current implementation is based on a learning-and-memorization paper.

  • Trainers: These modules provide differentiable look-up table representations; the current implementation uses Lagrange interpolation. They serve as an abstraction layer over hardware look-up tables that can be trained with backpropagation and used as building blocks for higher layers.

  • Mask Builder: These modules implement different ways of mapping the weights of a standard neural-network layer onto look-up tables. Two implementations are currently available: the Expanded implementation models each weight as a standalone look-up table and fills the table's remaining inputs randomly from a set of inputs determined by the layer implementation, while the Minimal implementation groups multiple weights into a single look-up table based on the LUT fan-in k.

  • Layers: These modules contain layer implementations and follow the same naming convention as PyTorch's nn module. Currently there are two implementations: Linear and Conv2d. The ultimate goal is to mirror all of PyTorch's main layers, which will provide a great deal of flexibility and allow more complex architectures to be implemented.

  • Optimizers: These modules provide optimizer wrappers that adapt standard PyTorch optimizers for binary and LUT training.
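The Trainers idea can be illustrated with a minimal sketch (an illustration, not DeepLut's actual implementation): a k-input LUT with entries w can be written as a multilinear Lagrange interpolation over the 2^k corners of {-1, +1}^k, which is differentiable in both the inputs and the LUT entries:

```python
import itertools
import torch

def lut_interpolate(x, weights):
    # x: (..., k) tensor with entries in [-1, 1]; weights: (2**k,) LUT entries.
    # Each corner c of {-1, +1}^k gets a basis polynomial that is 1 at c and
    # 0 at every other corner, so at binary inputs the exact LUT entry is
    # returned, while gradients flow for fractional inputs.
    k = x.shape[-1]
    out = torch.zeros(x.shape[:-1])
    for idx, corner in enumerate(itertools.product([-1.0, 1.0], repeat=k)):
        c = torch.tensor(corner)
        basis = torch.prod((1 + x * c) / 2, dim=-1)
        out = out + weights[idx] * basis
    return out
```

At a binary input such as x = (-1, 1) this returns the corresponding LUT row exactly, matching hardware behavior, while the gradient with respect to the entries is defined everywhere.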
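The difference between the two mask builders can be sketched in plain Python (the function names here are illustrative, not DeepLut's API):

```python
import random

def expanded_mask(n_inputs, k, seed=0):
    # Expanded: one k-input LUT per weight; the LUT's first input is the
    # weight's own input and the remaining k-1 are sampled from the rest.
    rng = random.Random(seed)
    masks = []
    for i in range(n_inputs):
        others = [j for j in range(n_inputs) if j != i]
        masks.append([i] + rng.sample(others, k - 1))
    return masks

def minimal_mask(n_inputs, k):
    # Minimal: pack k weights into a single LUT, so fewer tables are
    # needed (n_inputs is assumed to be divisible by k).
    return [list(range(i, i + k)) for i in range(0, n_inputs, k)]
```

For a layer with n inputs, the expanded builder produces n LUTs while the minimal builder produces n/k, trading model capacity against hardware cost.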
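A bare-bones version of the wrapper idea (hypothetical, not the OptimWrapper API) forwards calls to the underlying PyTorch optimizer and clamps the latent weights after each step, as is common in binary-network training:

```python
import torch

class SimpleBinaryWrapper:
    # Hypothetical sketch of an optimizer wrapper for binary training:
    # gradients update full-precision latent weights, which are clamped
    # to [-1, 1] so that their sign stays meaningful for binarization.
    def __init__(self, optim, clamp=True):
        self.optim = optim
        self.clamp = clamp

    def zero_grad(self):
        self.optim.zero_grad()

    def step(self):
        self.optim.step()
        if self.clamp:
            for group in self.optim.param_groups:
                for p in group["params"]:
                    p.data.clamp_(-1.0, 1.0)
```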

Why is DeepLut needed?

We believe a flexible, extensible, fast, and easy-to-use framework will help advance research in this area. Frameworks enable innovation and let researchers focus on experimentation. Extensibility, in particular, fosters an ecosystem in which researchers contribute plugins and implement new modules that benefit the whole community.

Code Examples

Imports

import torch
import torch.nn as nn
import torch.optim as optim
from deeplut.nn.Linear import Linear as dLinear
from deeplut.optim.OptimWrapper import OptimWrapper as dOptimWrapper
from deeplut.trainer.LagrangeTrainer import LagrangeTrainer
from deeplut.mask.MaskExpanded import MaskExpanded

Model

class LFC(nn.Module):
    def __init__(self, k, input_expanded, mask_builder_type, device):
        super(LFC, self).__init__()
        self.layers = nn.Sequential(
            nn.Flatten(),
            nn.Linear(784, 256),
            nn.BatchNorm1d(256),
            nn.ReLU(),
            dLinear(256, 256, k=k, binarization_level=1, input_expanded=input_expanded,
                    trainer_type=LagrangeTrainer, mask_builder_type=mask_builder_type,
                    bias=False, device=device),
            nn.BatchNorm1d(256),
            nn.ReLU(),
            dLinear(256, 256, k=k, binarization_level=1, input_expanded=input_expanded,
                    trainer_type=LagrangeTrainer, mask_builder_type=mask_builder_type,
                    bias=False, device=device),
            nn.BatchNorm1d(256),
            nn.ReLU(),
            dLinear(256, 256, k=k, binarization_level=1, input_expanded=input_expanded,
                    trainer_type=LagrangeTrainer, mask_builder_type=mask_builder_type,
                    bias=False, device=device),
            nn.BatchNorm1d(256),
            nn.ReLU(),
            dLinear(256, 10, k=k, binarization_level=1, input_expanded=input_expanded,
                    trainer_type=LagrangeTrainer, mask_builder_type=mask_builder_type,
                    bias=False, device=device),
            nn.BatchNorm1d(10),
        )

    def forward(self, x):
        return self.layers(x)

Main

model = LFC(k=2, input_expanded=True, mask_builder_type=MaskExpanded, device=None)

Optimizer

_optim = optim.Adam(model.parameters(), lr=0.01)
optimizer = dOptimWrapper(_optim, BinaryOptim=True)
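The pieces above can be wired into a standard PyTorch training loop. A sketch, assuming a DataLoader of (image, label) batches such as torchvision's MNIST; the loop works identically with DeepLut's wrapped optimizer and a plain one:

```python
import torch
import torch.nn as nn

def train_one_epoch(model, optimizer, loader):
    # One pass over the data: forward, cross-entropy loss, backward, step.
    criterion = nn.CrossEntropyLoss()
    model.train()
    total_loss = 0.0
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        total_loss += loss.item()
    return total_loss / len(loader)
```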

Contributors

a-moadel, madel0093
