mrousavy / brabenetz

🧠 A fast and clean supervised neural network in C++, capable of effectively using multiple cores



BrabeNetz

Brain - Image by medium.com

BrabeNetz is a supervised neural network written in C++, aiming to be as fast as possible. It multithreads effectively on the CPU where needed, is heavily performance-optimized, and is thoroughly documented inline. System Technology (BRAH), TGM 2017/18.

NuGet Download on NuGet

PM> Install-Package BrabeNetz

I've written two examples of using BrabeNetz in the Trainer class: training the XOR function ({0,0}=0, {0,1}=1, ..) and recognizing handwritten digits.

In my XOR example, I'm using a {2,3,1} topology (2 input, 3 hidden, and 1 output neurons), but BrabeNetz scales until the hardware reaches its limits. The digit recognizer uses a {784,500,100,10} network to train on handwritten digits from the MNIST database.

Be sure to read the network description, and check out my digit recognizer written in Qt (using a trained BrabeNetz MNIST dataset)

Benchmarks

Build: Release x64 | Windows 10 64bit

CPU: Intel i7 6700K @ 4.0 GHz × 8 cores

RAM: HyperX Fury DDR4 32GB CL14 2400MHz

SSD: Samsung 850 EVO 540MB/s

Commit: 53328c3

Actual prediction of the digit recognizer network on macOS Mojave (console output)

Training XOR 1000 times takes just 0.49 ms (console output with elapsed time)

Actual prediction of the trained digit recognizer network on Debian Linux

Effectively using all available cores (24/24, 100% workload in Task Manager)

Task resource viewer (htop) on Linux (Debian 9, Linux 4.9.62, KDE Plasma)

Specs

  • Optimized algorithms via raw arrays instead of std::vector, and more
  • Smart multithreading with OpenMP wherever the thread-spawn overhead is worth the performance gain
  • Scalability (neuron count, layer count) - only limited by hardware
  • Easy to use (inputs, outputs)
  • Randomly generated initial values
  • Easy binary save/load with network::save(string)/network::load(string) (state.nn file)
  • Sigmoid squashing function
  • Biases for each neuron
  • network_topology helper objects for loading/saving state and inspecting the network
  • brabenetz wrapper class for an easy-to-use interface

Usage

Example Usage Code

  1. Build & link library

  2. Choose your interface

    1. brabenetz.h: [Recommended] A wrapper around the raw network.h interface, adding error handling and modern C++ interface styling such as std::vector, std::exception, etc.
    2. network.h: The raw network with C-style arrays and no bounds or error checking. Only use this if performance is critical.
  3. Constructors

    1. (initializer_list<int>, properties): Construct a new neural network with the given network size (e.g. { 2, 3, 4, 1 }) and randomize all base weights and biases - ref
    2. (network_topology&, properties): Construct a new neural network with the given network topology and import its existing weights and biases - ref
    3. (string, properties): Construct a new neural network and load its state from the file specified in properties.state_file - ref
  4. Functions

    1. network_result brabenetz::feed(std::vector<double>& input_values): Feed the network input values and forward propagate through all neurons to estimate a possible output (Use the network_result structure (ref) to access the result of the forward propagation, such as .values to view the output) - ref
    2. double network_result::adjust(std::vector<double>& expected_output): Backwards propagate through the whole network to adjust wrong neurons for result trimming and return the total network error - ref
    3. void brabenetz::save(string path): Save the network's state to disk by serializing weights - ref
    4. void brabenetz::set_learnrate(double value): Set the network's learning rate. It is good practice and generally recommended to use one divided by the train count, so the learn rate decreases the more often you train - ref
    5. network_topology& brabenetz::build_topology(): Build and set the network topology object of the current network's state (can be used for network visualization or similar) - ref

Usage examples can be found here and here.

Thanks for using BrabeNetz!

Buy Me a Coffee at ko-fi.com

brabenetz's People

Contributors: imgbotapp, mrousavy, mrousavy-tgm

brabenetz's Issues

Fix backpropagation

Backpropagation does not work; it seems like something is cancelling itself out.

After 1000 training rounds (4 per input combination), the network's outputs changed by no more than ±0.01.

Backpropagation Function is here.

See this for an explanation of all arrays/members of the network class.

Use NetworkTopology instead of arrays

Concept:

Use vector<T> (T = double) instead of all those raw T* arrays, meaning we no longer need

  • the NetworkTopology-to-array copy (the FillWeights(..) function)
  • to store sizes separately
  • those complicated array accesses
  • those complicated delete[] calls

Pros:

  • Much more readable
  • Easier to maintain

Cons:

  • Slower (but by how much? TODO: benchmark it)

Compile with g++

Compile the BrabeNetz library project for *nix systems with g++.

(Optionally: build a console app for *nix systems as well.)
