This repository demonstrates three implementations of a neural network built from scratch, each trained with backpropagation and gradient descent.
DATASET USED: The moons dataset provided by the sklearn library in Python.
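A minimal sketch of loading the moons dataset with sklearn. The parameters shown (n_samples, noise, random_state) are illustrative assumptions, not the repository's exact settings:

```python
# Hypothetical setup: the repository's actual n_samples/noise values are unknown.
from sklearn.datasets import make_moons

X, y = make_moons(n_samples=200, noise=0.20, random_state=0)
print(X.shape, y.shape)  # feature matrix and binary labels
```

`make_moons` returns a 2-D feature matrix and a label vector of 0s and 1s, which is why a single sigmoid output unit suffices for this task.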
Part 1) implements mini-batch gradient descent.
Part 2) uses an annealing schedule to decay the learning rate epsilon over time.
Part 3) uses a sigmoid activation function instead of the tanh used in the other examples.
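The three parts above can be sketched together in one small training loop: mini-batches (Part 1), an annealed epsilon (Part 2), and sigmoid activations with backpropagation (Part 3). Everything here is an illustrative assumption — the layer sizes, the epsilon values, the 1/(1+decay*t) annealing rule, and the toy stand-in data are not the repository's exact choices:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Toy stand-in for the moons data: two separable 2-D classes (assumption).
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

# One hidden layer with 5 units; sizes are illustrative.
W1 = rng.normal(scale=0.5, size=(2, 5)); b1 = np.zeros(5)
W2 = rng.normal(scale=0.5, size=(5, 1)); b2 = np.zeros(1)

eps0, decay, batch_size = 0.5, 0.001, 32
for epoch in range(200):
    eps = eps0 / (1.0 + decay * epoch)           # Part 2: annealed epsilon
    order = rng.permutation(len(X))
    for start in range(0, len(X), batch_size):   # Part 1: mini-batches
        idx = order[start:start + batch_size]
        xb, yb = X[idx], y[idx]
        # Forward pass with sigmoid activations (Part 3).
        h = sigmoid(xb @ W1 + b1)
        p = sigmoid(h @ W2 + b2)
        # Backprop for cross-entropy loss with a sigmoid output: delta = p - y.
        d2 = (p - yb) / len(xb)
        gW2 = h.T @ d2; gb2 = d2.sum(0)
        d1 = (d2 @ W2.T) * h * (1.0 - h)         # sigmoid'(z) = s(z)(1 - s(z))
        gW1 = xb.T @ d1; gb1 = d1.sum(0)
        W2 -= eps * gW2; b2 -= eps * gb2
        W1 -= eps * gW1; b1 -= eps * gb1

# Evaluate on the full set.
h = sigmoid(X @ W1 + b1)
p = sigmoid(h @ W2 + b2)
acc = float(np.mean((p > 0.5) == y))
print(f"final accuracy: {acc:.2f}")
```

Note that annealing the learning rate lets early epochs take large steps while later epochs settle into a minimum, which is the usual motivation for a schedule like the one sketched here.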
The loss is printed at each iteration, and at the end a graph of the model built on the moons dataset is plotted.
Run Test.py to execute each example.