A six-layer perceptron with five hidden layers and one output layer. Using ReLU as the activation function, the five hidden layers transform the input features from 784 dimensions down to 16. After 10 epochs of training, the network reaches high accuracy on Fashion-MNIST.
csu-schwarze / six_layer_perceptron
License: MIT License
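The architecture described above (784-dimensional input, five ReLU hidden layers narrowing to 16 dimensions, then an output layer for the 10 Fashion-MNIST classes) can be sketched as a plain NumPy forward pass. This is a minimal illustration, not the repository's code: the intermediate hidden widths are assumptions, since the description only fixes the first (784) and last (16) hidden dimensions, and the repo's actual framework is not stated.

```python
import numpy as np

def relu(x):
    # ReLU activation, applied after each hidden layer
    return np.maximum(x, 0.0)

# Hypothetical layer widths: only 784 (input) and 16 (last hidden) are
# given in the description; the three middle widths are illustrative.
dims = [784, 256, 128, 64, 32, 16, 10]  # five hidden layers + output

rng = np.random.default_rng(0)
# He-style initialization for ReLU layers: weight matrix and bias per layer
params = [(rng.standard_normal((i, o)) * np.sqrt(2.0 / i), np.zeros(o))
          for i, o in zip(dims[:-1], dims[1:])]

def forward(x, params):
    # ReLU after each of the five hidden layers; the output layer
    # produces raw logits for the 10 Fashion-MNIST classes.
    for w, b in params[:-1]:
        x = relu(x @ w + b)
    w, b = params[-1]
    return x @ w + b

# A batch of 8 flattened 28x28 images -> logits of shape (8, 10)
logits = forward(rng.standard_normal((8, 784)), params)
print(logits.shape)
```

Training would add a softmax cross-entropy loss and an optimizer on top of this forward pass; the sketch only shows the layer structure the description names.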