We will write code for different ways to initialize parameters, for gradient checking, for regularization techniques such as dropout and L2, and for optimization algorithms such as mini-batch gradient descent, stochastic gradient descent, and Adam. All of this code is written from scratch, without any built-in ML or deep learning libraries or frameworks. The repository contains a separate folder for each initialization technique, for gradient checking, for each regularization technique, and for each optimization algorithm.
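As an illustration of the initialization part, here is a minimal from-scratch sketch (using only NumPy, which is not a deep learning framework) of He and Xavier weight initialization. The function name `initialize_parameters` and the dictionary layout are assumptions for this example, not necessarily the names used in the repository's code.

```python
import numpy as np

def initialize_parameters(layer_dims, method="he", seed=0):
    """Initialize weights and biases for a fully connected network.

    layer_dims: list of layer sizes, e.g. [784, 128, 10].
    method: "he" (suited to ReLU) or "xavier" (suited to tanh).
    """
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        fan_in = layer_dims[l - 1]
        if method == "he":
            scale = np.sqrt(2.0 / fan_in)   # He scaling: std = sqrt(2 / fan_in)
        else:
            scale = np.sqrt(1.0 / fan_in)   # Xavier scaling: std = sqrt(1 / fan_in)
        params[f"W{l}"] = rng.standard_normal((layer_dims[l], fan_in)) * scale
        params[f"b{l}"] = np.zeros((layer_dims[l], 1))  # biases start at zero
    return params
```

Scaling the random weights by the fan-in keeps activation variance roughly constant across layers, which is the main reason these schemes train faster than plain random initialization.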
xiaojiean / improving-performance-of-deep-neural-networks
This project is forked from shahjainil2406/improving-performance-of-deep-neural-networks.
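For the optimization side, a single Adam update step can be sketched from scratch as follows. The function signature, the `state` dictionary of first/second moments, and the parameter names are assumptions for illustration, not the repository's actual API; the default hyperparameters match the standard Adam formulation.

```python
import numpy as np

def adam_update(params, grads, state, t, lr=0.001,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """Apply one Adam step to every parameter in `params`.

    state: dict mapping each parameter name to its (m, v) moment estimates.
    t: 1-based step count, used for bias correction.
    """
    for k in params:
        m, v = state.get(k, (np.zeros_like(params[k]), np.zeros_like(params[k])))
        m = beta1 * m + (1 - beta1) * grads[k]          # first-moment estimate
        v = beta2 * v + (1 - beta2) * grads[k] ** 2     # second-moment estimate
        state[k] = (m, v)
        m_hat = m / (1 - beta1 ** t)                    # bias correction
        v_hat = v / (1 - beta2 ** t)
        params[k] = params[k] - lr * m_hat / (np.sqrt(v_hat) + eps)
    return params, state
```

In a training loop this would be called once per mini-batch with `t` incremented each call; the bias correction matters most in the first few steps, when `m` and `v` are still close to their zero initialization.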