theolepage / prophecy

A tiny deep neural network framework developed from scratch in C++ and CUDA.

Languages: C++ 97.48%, Makefile 2.52%
Topics: machine-learning, neural-network, machine-learning-from-scratch, machine-learning-framework, ml-framework, cpp17, cpp

prophecy's Introduction

prophecy

A tiny deep neural network framework developed from scratch in C++ and CUDA. See the Usage section below for a minimal XOR example.

Usage

#include <iostream>
#include <memory>
#include <utility>
#include <vector>

#include "model/model.hh"
#include "layer_implem/dense_layer.hh"

using model_type = float;
using training_set = std::vector<Matrix<model_type>>;

// Build one XOR sample: a 2x1 input column vector (a, b) and its expected 1x1 output a ^ b.
static auto get_xor(unsigned a, unsigned b)
{
    auto mx = Matrix<model_type>(2, 1);
    mx(0, 0) = a;
    mx(1, 0) = b;

    auto my = Matrix<model_type>(1, 1);
    my(0, 0) = a^b;

    return std::make_pair(mx, my);
}

// Fill the training set with the four possible XOR input/output pairs.
static void create_dataset(
        training_set& x_train,
        training_set& y_train)
{
    auto a = get_xor(0, 0);
    x_train.emplace_back(a.first);
    y_train.emplace_back(a.second);

    auto b = get_xor(0, 1);
    x_train.emplace_back(b.first);
    y_train.emplace_back(b.second);

    auto c = get_xor(1, 0);
    x_train.emplace_back(c.first);
    y_train.emplace_back(c.second);

    auto d = get_xor(1, 1);
    x_train.emplace_back(d.first);
    y_train.emplace_back(d.second);
}

int main(void)
{
    auto model = Model<model_type>();
    auto s = SigmoidActivationFunction<model_type>();

    // Create model
    model.add(new InputLayer<model_type>(2));
    model.add(new DenseLayer<model_type>(2, s));
    model.add(new DenseLayer<model_type>(1, s));

    // Create dataset
    auto x_train = training_set();
    auto y_train = training_set();
    create_dataset(x_train, y_train);

    // Train model
    model.compile(0.1);
    model.train(x_train, y_train, 10000, 1);

    // Test the model
    for (size_t i = 0; i < x_train.size(); i++)
    {
        auto x = x_train.at(i);
        auto x_t = x.transpose();
        auto y = model.predict(x);
        std::cout << "Input:  " << x_t;
        std::cout << "Output: " << y << std::endl;
    }

    return 0;
}
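
The raw sigmoid activation printed in the loop above can be turned into a hard 0/1 prediction with a simple threshold. The helper below is a hedged sketch, not part of the project API: it assumes the value returned by predict is a Matrix<model_type> that can be indexed with operator()(row, col), like the other Matrix objects in the example.

// Hedged sketch: threshold the single sigmoid output of the XOR model into
// a 0/1 label. Assumes the prediction is a Matrix<model_type> indexable
// with operator()(row, col), as in the Usage example above.
static unsigned to_label(Matrix<model_type>& prediction)
{
    return prediction(0, 0) >= 0.5f ? 1u : 0u;
}

Inside the test loop, printing to_label(y) would then show 0 or 1 instead of the raw activation.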

To-Do

Refer to the issues listed below.

prophecy's People

Contributors

ariellelevi, kh4ster, theolepage


prophecy's Issues

Improve model

  • Compile layers automatically; the compiled flag should only be used by the model once an optimizer has been chosen
  • Report accuracy in the progress display and in evaluate, and show ETA/elapsed time during training
  • Validation split (see the sketch after this list)
  • License
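
The validation split item above could start from something like the hedged sketch below. It relies only on the training_set alias from the Usage example (std::vector<Matrix<model_type>>); the function name and the 20% default ratio are illustrative assumptions, not existing project API.

// Hedged sketch: split a dataset into training and validation parts.
// Only the training_set alias from the Usage example is assumed; the
// function name and the default ratio are illustrative.
#include <cstddef>
#include <utility>

static std::pair<training_set, training_set>
split_validation(const training_set& data, float validation_ratio = 0.2f)
{
    const std::size_t val_count =
        static_cast<std::size_t>(data.size() * validation_ratio);
    const std::size_t train_count = data.size() - val_count;

    training_set train(data.begin(), data.begin() + train_count);
    training_set val(data.begin() + train_count, data.end());
    return std::make_pair(std::move(train), std::move(val));
}

The same function would be applied to x_train and y_train with the same ratio so that inputs and labels stay aligned (shuffling the pairs beforehand would usually come first).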

Add tests and docs

  • Asserts and exceptions
  • Write unit tests (layers, compute, model)
  • Write docs (layers, compute, model)

Improve dense layer

  • Batch training with mini-batches

  • Implement the cross-entropy cost function (see the sketch after this list)
  • Average the cost/loss over each batch for the progress display?

  • Regularization

  • Better initialization of weights and biases
  • Learning rate decay (optimizers, custom function)

  • Adam optimizer
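
For the cross-entropy item above, a minimal sketch of the binary cross-entropy cost over one column-vector sample is given below. It assumes only the Matrix<T>(rows, cols) constructor and operator()(row, col) shown in the Usage example; the epsilon clamp and the explicit rows parameter are assumptions, since the Matrix API for querying dimensions is not shown.

// Hedged sketch: binary cross-entropy cost over one sample stored as a
// column vector. Assumes only Matrix<T>::operator()(row, col) from the
// Usage example; the epsilon clamp and the explicit rows parameter are
// assumptions.
#include <algorithm>
#include <cmath>
#include <cstddef>

template <typename T>
static T binary_cross_entropy(Matrix<T>& y_true, Matrix<T>& y_pred, std::size_t rows)
{
    const T eps = static_cast<T>(1e-7);
    T cost = 0;
    for (std::size_t i = 0; i < rows; i++)
    {
        const T y = y_true(i, 0);
        // Clamp the prediction away from 0 and 1 so the logs stay finite.
        const T p = std::clamp(y_pred(i, 0), eps, static_cast<T>(1) - eps);
        cost -= y * std::log(p) + (static_cast<T>(1) - y) * std::log(static_cast<T>(1) - p);
    }
    return cost / static_cast<T>(rows);
}

Combined with the sigmoid layers already in place, this loss has a particularly simple gradient (prediction minus target), which is the usual reason to pair the two.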

Refactoring

  • Release and debug modes, compilation flags, C++20
  • Merge layer and layer_implem
  • Include root path (remove ..)
  • typedef uint?
  • namespace prophecy, core
  • Coding style, trailing spaces (clang)
  • Use _ suffix
  • Make ActivationFunction abstract
  • Put const on f_ and fd_ of activation functions
  • Rename variables train model, scope
  • Compiling should not be mandatory before calling predict
  • When to use = delete on ctor/dtor, put explicit?
  • Remove dataset_handler and conv
  • Layer feedforward last param default false
  • Put implementations below class declaration

Prophecy 2.0

  • Simple tensor class
  • Layers: dense, relu, sigmoid, softmax
  • Model: predict, fit (validation), evaluate, summary
  • Loss: MSE, cross-entropy
  • Metrics: loss, accuracy
  • Weights regularization
  • Learning rate schedulers
  • Optim: Adam, SGD (see the Adam sketch below)
  • Python bindings
  • Export/import weights
  • Optimizations: CUDA, OpenMP
  • Clean C++, CMake, tests, docs, example notebooks

Objective: fast and accurate classification of MNIST.
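
To make the "Optim: Adam, SGD" item concrete, the sketch below shows one Adam update step over flat std::vector<float> parameters. Since the simple tensor class listed above does not exist yet, the container choice, the AdamState struct and the default hyper-parameters are assumptions; only the standard Adam update equations are taken as given.

// Hedged sketch: one Adam update step over flat parameter/gradient vectors.
// The struct layout, container choice and default hyper-parameters are
// assumptions, not existing project API.
#include <cmath>
#include <cstddef>
#include <vector>

struct AdamState
{
    std::vector<float> m; // first-moment (mean) estimates, one per parameter
    std::vector<float> v; // second-moment (uncentered variance) estimates
    std::size_t t = 0;    // time step, used for bias correction
};

static void adam_step(std::vector<float>& params,
                      const std::vector<float>& grads,
                      AdamState& state,
                      float lr = 0.001f, float beta1 = 0.9f,
                      float beta2 = 0.999f, float eps = 1e-8f)
{
    if (state.m.empty())
    {
        state.m.assign(params.size(), 0.0f);
        state.v.assign(params.size(), 0.0f);
    }
    state.t += 1;

    for (std::size_t i = 0; i < params.size(); i++)
    {
        state.m[i] = beta1 * state.m[i] + (1.0f - beta1) * grads[i];
        state.v[i] = beta2 * state.v[i] + (1.0f - beta2) * grads[i] * grads[i];

        // Bias-corrected moment estimates.
        const float m_hat = state.m[i] / (1.0f - std::pow(beta1, static_cast<float>(state.t)));
        const float v_hat = state.v[i] / (1.0f - std::pow(beta2, static_cast<float>(state.t)));

        params[i] -= lr * m_hat / (std::sqrt(v_hat) + eps);
    }
}

Plain SGD is the degenerate case where the update is simply params[i] -= lr * grads[i]; a learning rate scheduler would adjust lr between epochs.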
