TiNNy: Simple neural networks built from scratch with just Python and Numpy

TiNNy is a minimalist project aimed at demonstrating the power and potential of neural networks using nothing but Python and Numpy. This project is designed for educational purposes, allowing users to understand the fundamentals of neural network architectures, forward propagation, backpropagation, and training processes in a hands-on manner.

Dive Into Demos

TiNNy comes equipped with two sample Python notebooks. These demos are a great way to see TiNNy in action:

  • demo_classification.ipynb: a TiNNy classification project using the MNIST handwritten digit dataset and a heart attack prediction dataset.
  • demo_regression.ipynb: a TiNNy regression project using the Boston housing price and diamond price prediction datasets.

Math Explained

We begin with a matrix of input features X and a vector of labels y, since neural networks of this kind are trained with supervised learning. The goal is to learn a function that maps inputs to outputs, minimizing the error between the predicted outputs and the actual labels. This process involves three main steps: the forward pass, backpropagation, and the parameter update. Here's how each step works mathematically:

Forward

In the forward pass, the network applies a series of transformations to the input data to compute the output predictions. This involves computing the linear combination of inputs and weights, adding a bias, and then applying an activation function. The process for layer i can be visualized as:

$$L^{i} = W^{i} A^{i-1} + b^{i} \qquad\qquad A^{i} = g\left(L^{i}\right)$$

Here, L^i represents the linear combination at layer i, W^i and b^i are the weights and bias for layer i, respectively, and A^(i-1) is the activation from the previous layer (or the input data for the first layer). A^i is the output of the activation function g applied to L^i, moving the data forward through the network.
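
For illustration, here is a minimal NumPy sketch of a single forward step. The function and variable names here are assumptions for this example, not necessarily tiNNy's internal API, and ReLU is just one possible choice of g:

```python
import numpy as np

def relu(L):
    """An example activation function g."""
    return np.maximum(0, L)

def forward_layer(W, b, A_prev, g=relu):
    """One forward step: linear combination plus bias, then activation.

    W: (n_i, n_prev) weights, b: (n_i, 1) bias,
    A_prev: (n_prev, m) activations from the previous layer (or X).
    """
    L = W @ A_prev + b   # L^i = W^i A^{i-1} + b^i (bias broadcasts over m examples)
    A = g(L)             # A^i = g(L^i)
    return L, A          # L is kept so backpropagation can reuse it
```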

Backpropagation

Backpropagation is the process of adjusting the network's weights and biases to minimize the error between the actual labels and the predictions. It involves two key steps: computing the gradients for the output layer and then for the dense (hidden) layers.

Backpropagation in Output Layer

The gradients of the loss with respect to the output layer's activations, weights, and biases are computed as follows:

$$dL^{k} = A^{k} - y \qquad dW^{k} = \frac{1}{m}\, dL^{k} \left(A^{k-1}\right)^{T} \qquad db^{k} = \frac{1}{m} \sum dL^{k}$$

Here, dL^k denotes the derivative of the loss with respect to the activations of the last layer, and dW^k and db^k are the gradients of the loss with respect to the weights and biases of the last layer, respectively. A^k is the activation of the last layer, y holds the true labels, and m is the number of training examples.
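
A NumPy sketch of these output-layer gradients, assuming the common pairing of output activation and loss (e.g. softmax with cross-entropy) under which the error term simplifies to A^k − y; the names are illustrative only:

```python
import numpy as np

def backward_output_layer(A_k, y, A_prev):
    """Gradients for the last layer.

    A_k: (n_k, m) predictions, y: (n_k, m) labels,
    A_prev: (n_prev, m) activations feeding the last layer.
    """
    m = y.shape[1]                               # number of training examples
    dL = A_k - y                                 # dL^k = A^k - y
    dW = (dL @ A_prev.T) / m                     # dW^k = (1/m) dL^k (A^{k-1})^T
    db = np.sum(dL, axis=1, keepdims=True) / m   # db^k = (1/m) sum dL^k
    return dL, dW, db
```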

Backpropagation in Dense Layers

For each of the preceding layers, the gradients are computed using the derivative of the activation function, which propagates the error backward through the network:

$$dL^{i} = \left(W^{i+1}\right)^{T} dL^{i+1} \odot g'\left(L^{i}\right) \qquad dW^{i} = \frac{1}{m}\, dL^{i} \left(A^{i-1}\right)^{T} \qquad db^{i} = \frac{1}{m} \sum dL^{i}$$

Here, g'(L^i) represents the derivative of the activation function applied at layer i, and ⊙ denotes the element-wise product; together they allow the gradient of the loss to be propagated back through the network.
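
The same step in NumPy, as a sketch assuming ReLU as the hidden activation (tiNNy's actual activations may differ):

```python
import numpy as np

def relu_derivative(L):
    """g'(L) for ReLU: 1 where the pre-activation is positive, else 0."""
    return (L > 0).astype(float)

def backward_dense_layer(dL_next, W_next, L_i, A_prev, g_prime=relu_derivative):
    """Propagate the error from layer i+1 back through hidden layer i."""
    m = A_prev.shape[1]
    dL = (W_next.T @ dL_next) * g_prime(L_i)     # dL^i = (W^{i+1})^T dL^{i+1} (elementwise) g'(L^i)
    dW = (dL @ A_prev.T) / m                     # dW^i = (1/m) dL^i (A^{i-1})^T
    db = np.sum(dL, axis=1, keepdims=True) / m   # db^i = (1/m) sum dL^i
    return dL, dW, db
```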

Update Parameters

Finally, the parameters of the network are updated using the gradients computed during backpropagation. This step adjusts the weights and biases in the direction that reduces the error:

$$W^{i} \leftarrow W^{i} - \alpha\, dW^{i} \qquad\qquad b^{i} \leftarrow b^{i} - \alpha\, db^{i}$$

W^i and b^i are updated by subtracting the gradients dW^i and db^i, scaled by a learning rate α. This iterative cycle of forward pass, backpropagation, and parameter update continues until the model's performance reaches a satisfactory level.
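
A sketch of the update step, with the full training cycle shown as comments for a single-layer network (the epoch count and learning rate are placeholders, and the helper functions are the illustrative ones sketched above):

```python
def update_parameters(W, b, dW, db, alpha=0.01):
    """One gradient descent step: move each parameter against its gradient."""
    W = W - alpha * dW   # W^i := W^i - alpha * dW^i
    b = b - alpha * db   # b^i := b^i - alpha * db^i
    return W, b

# Putting it together:
# for epoch in range(1000):
#     L, A = forward_layer(W, b, X)                    # forward pass
#     _, dW, db = backward_output_layer(A, y, X)       # backpropagation
#     W, b = update_parameters(W, b, dW, db, alpha=0.1)  # parameter update
```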

Getting Started with TiNNy

Set up TiNNy on your own machine by following these steps:

First, clone the repository to your local machine using Git. Open your terminal and run the following commands:

git clone https://github.com/rodmarkun/tiNNy
cd tiNNy

TiNNy requires certain Python packages to function properly. Ensure all dependencies are installed by executing the following command in the terminal:

pip install -r ./requirements.txt

With the installation complete, you're ready to incorporate TiNNy into your Python scripts or Jupyter notebooks. Simply import the library using:

import tinny

You are now equipped to explore the features and capabilities of TiNNy. Happy coding!
