
teelinsan / kerasdropconnect

36 stars · 2 watchers · 12 forks · 10 KB

An implementation of DropConnect Layer in Keras

License: MIT License

Python 100.00%
keras machine-learning deep-learning keras-neural-networks keras-implementations regularization dropconnect drop connect

kerasdropconnect's Introduction

Hi there 👋, welcome to my GitHub profile!

Google Scholar · Twitter · LinkedIn

I am a PhD student in Computer Science at GLADIA, Sapienza University of Rome (one of the top European universities in AI according to CS ranking). My research activity lies at the intersection of natural language processing, representation learning and machine intelligence.



kerasdropconnect's People

Contributors

teelinsan

Stargazers


Watchers


kerasdropconnect's Issues

Need to scale the weights and biases

Hi,
Thanks for the implementation. However, I found a bug in it.
It uses tf.nn.dropout, which scales the kept values up by 1 / keep_prob (see its documentation). This scaling is not wanted for DropConnect, so the weights and biases have to be scaled back down. Here is the required change (note that self.prob in your implementation is the probability of dropping, not of keeping):

self.layer.kernel = K.in_train_phase(K.dropout(self.layer.kernel, self.prob), self.layer.kernel) * (1-self.prob)

self.layer.bias = K.in_train_phase(K.dropout(self.layer.bias, self.prob), self.layer.bias) * (1-self.prob)
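
For reference, here is a minimal sketch of a wrapper layer with the proposed change applied. This is a sketch only, assuming standalone Keras 2.x on the TensorFlow backend; the class name, the prob argument, and the kernel/bias attributes mirror the repository's API, but the actual implementation may differ.

# Sketch only: assumes standalone Keras 2.x with the TensorFlow backend.
from keras import backend as K
from keras.layers import Wrapper


class DropConnect(Wrapper):
    def __init__(self, layer, prob=0.0, **kwargs):
        self.prob = prob  # probability of dropping a weight, not of keeping it
        super(DropConnect, self).__init__(layer, **kwargs)
        if 0.0 < self.prob < 1.0:
            self.uses_learning_phase = True

    def build(self, input_shape):
        if not self.layer.built:
            self.layer.build(input_shape)
            self.layer.built = True
        super(DropConnect, self).build()

    def compute_output_shape(self, input_shape):
        return self.layer.compute_output_shape(input_shape)

    def call(self, x):
        if 0.0 < self.prob < 1.0:
            # K.dropout rescales the kept entries by 1 / (1 - prob), so the result
            # is multiplied by (1 - prob) to undo that scaling, as suggested above.
            self.layer.kernel = K.in_train_phase(
                K.dropout(self.layer.kernel, self.prob), self.layer.kernel
            ) * (1 - self.prob)
            self.layer.bias = K.in_train_phase(
                K.dropout(self.layer.bias, self.prob), self.layer.bias
            ) * (1 - self.prob)
        return self.layer.call(x)

With this change, training drops connections without rescaling the survivors, and at inference the weights are multiplied by (1 - prob), which matches their expected value under dropping.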

TensorFlow and Keras versions

Hi,
I am trying to use DropConnect, which I have already installed on my machine,
but when I build and fit the model, the following error appears:

a "Graph" tensor. It is possible to have Graph tensors
leak out of the function building context by including a
tf.init_scope in your function building code.
For example, the following function will fail:
  @tf.function
  def has_init_scope():
    my_constant = tf.constant(1.)
    with tf.init_scope():
      added = my_constant * 2
The graph tensor has name: Model/drop_connect/mul:0

I think there is a mismatch between my TensorFlow and Keras versions:
TensorFlow = 2.0.0
Keras = 2.4.0

This is how I use DropConnect:

...
X = Dense(64, activation='relu')(X)  
X = DropConnect(layer=Dense(units=32), prob=0.2)(X)
...

Thanks!
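
A note on the likely cause (an assumption, not confirmed in the thread): Keras 2.4.0 is only a thin wrapper around tf.keras and requires TensorFlow >= 2.2, so pairing it with TensorFlow 2.0.0, or mixing standalone-Keras and tf.keras layers in one model, can leak symbolic "Graph" tensors like the one above. Below is a hedged sketch of a version-consistent setup; the dropconnect import path and the layer sizes are hypothetical, and the DropConnect wrapper itself would need to be built against tf.keras rather than standalone Keras.

# Sketch only: requires TensorFlow >= 2.2 (which bundles a matching tf.keras).
import tensorflow as tf
from tensorflow.keras.layers import Dense, Input
from tensorflow.keras.models import Model
from dropconnect import DropConnect  # hypothetical import path for this repository's wrapper

print(tf.__version__)  # verify the installed TensorFlow version first

inputs = Input(shape=(128,))
x = Dense(64, activation='relu')(inputs)
x = DropConnect(layer=Dense(units=32), prob=0.2)(x)
outputs = Dense(1, activation='sigmoid')(x)

model = Model(inputs, outputs)
model.compile(optimizer='adam', loss='binary_crossentropy')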
