🤖 cONNXr - C ONNX Runtime

An ONNX runtime written in pure C99 with zero dependencies, focused on embedded devices. Run inference on your machine learning models no matter which framework you trained them with and no matter which device you run on. It is a good fit for old hardware that cannot use modern C or C++ toolchains.

📗 Documentation

Documentation about the project, how to contribute, the architecture, and much more is available here.

🎓 Introduction

This repo contains a pure C99 runtime for running inference on ONNX models. You can train your model with your favourite framework (TensorFlow, Keras, scikit-learn) and, once trained, export it to a .onnx file that is then used to run inference. This makes the library completely framework agnostic: no matter how you train your model, this runtime executes it through the common interface that ONNX provides. The runtime is designed for embedded devices that may not be able to compile recent C++ standards. No GPUs, no hardware accelerators, just pure single-threaded C99 code, compatible with almost any embedded device. Dealing with old hardware? This might be for you as well.

This project can also be useful if you are working on bare metal hardware with dedicated accelerators. In that case, you might find it useful to reuse the architecture and replace specific operators with your own implementations, as in the sketch below.
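
To make that concrete, here is a minimal sketch of swapping an operator body for an accelerated one while keeping the same calling contract. The node_ctx type and all function names below are hypothetical illustrations, not the project's actual operator interface; check the documentation for the real one.

#include <stddef.h>
#include <stdio.h>

/* Hypothetical per-node context; the runtime's real context type
   (resolved inputs, attributes, preallocated outputs) differs. */
typedef struct {
    const float *input;
    float       *output;
    size_t       n;
} node_ctx;

/* Reference-style operator: a plain C99 loop (here, ReLU). */
static int operator_relu_ref(node_ctx *ctx)
{
    for (size_t i = 0; i < ctx->n; i++)
        ctx->output[i] = ctx->input[i] > 0.0f ? ctx->input[i] : 0.0f;
    return 0;
}

/* Drop-in replacement with the same contract: on real hardware this
   body would call the vendor's DSP/NPU kernel instead of the loop. */
static int operator_relu_accel(node_ctx *ctx)
{
    /* vendor_relu(ctx->input, ctx->output, ctx->n);  device call */
    return operator_relu_ref(ctx); /* software fallback keeps the sketch runnable */
}

int main(void)
{
    float in[4] = { -1.0f, 2.0f, -3.0f, 4.0f };
    float out[4];
    node_ctx ctx = { in, out, 4 };
    operator_relu_accel(&ctx);
    for (size_t i = 0; i < 4; i++)
        printf("out[%zu] = %f\n", i, out[i]);
    return 0;
}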

Note that this project is in a very early stage so its not even close to be production ready. Developers are needed so feel free to contact or contribute with a pull request. You can also have a look to the opened issues if you want to contribute, specially the ones labeled for beginners. See contributing section.

🖥 Out of the box examples

Some very well known models are supported out of the box: just build the command line tool as follows and call it with two parameters, first the ONNX model and second the input to run inference on. Note that the input has to be a .pb file. If you have your own model and it is not working, it is probably using an operator that we have not implemented yet, so feel free to open an issue and we will be happy to help.

make all
build/connxr test/mnist/model.onnx test/mnist/test_data_set_0/input_0.pb
build/connxr test/tiny_yolov2/Model.onnx test/tiny_yolov2/test_data_set_0/input_0.pb
build/connxr test/super_resolution/super_resolution.onnx test/super_resolution/test_data_set_0/input_0.pb
build/connxr test/mobilenetv2-1.0/mobilenetv2-1.0.onnx test/mobilenetv2-1.0/test_data_set_0/input_0.pb

⚙ Example

If you want to use cONNXr as part of your own code, you can include all the source files in your project and compile them together; linking it as a static library is not supported yet.

#include <stdio.h>
/* plus the cONNXr headers declaring openOnnxFile, openTensorProtoFile,
   resolve and inference */

int main()
{
  /* Open your onnx model */
  Onnx__ModelProto *model = openOnnxFile("model.onnx");

  /* Create your input tensor or load a protocol buffer one */
  Onnx__TensorProto *inp0set0 = openTensorProtoFile("input0.pb");

  /* Set the input name */
  inp0set0->name = model->graph->input[0]->name;

  /* Create the array of inputs to the model */
  Onnx__TensorProto *inputs[] = { inp0set0 };

  /* Resolve all inputs and operators */
  resolve(model, inputs, 1);

  /* Run inference on your input */
  Onnx__TensorProto **output = inference(model, inputs, 1);

  /* Print the last output, which is the model output */
  for (int i = 0; i < all_context[_populatedIdx].outputs[0]->n_float_data; i++){
      printf("n_float_data[%d] = %f\n", i, all_context[_populatedIdx].outputs[0]->float_data[i]);
  }

  return 0;
}

๐Ÿท Related Projects

Other C/C++ related projects: onnxruntime, darknet, uTensor, nnom, ELL, plaidML, deepC, onnc

⛓ Limitations

  • Only a few basic operators are implemented, so a model containing an unimplemented operator will fail.
  • Each operator is defined for many data types (double, float, int16, int32), but only a few of these variants are implemented.
  • The reference implementation is float, so you may run into trouble with other types (see the sketch after this list).
  • As a general note, this project is a proof of concept/prototype, so bear that in mind.
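
Because float is the reference path, one cheap safeguard is to check a tensor's declared element type before running inference. Below is a minimal sketch assuming the protobuf-c generated ONNX bindings used in the example above; the header name and enum constant follow protobuf-c naming conventions and may differ in your build.

#include "onnx.pb-c.h" /* protobuf-c generated ONNX types; name is an assumption */

/* Returns 1 if the tensor declares float data (the reference path);
   other element types may hit unimplemented operator variants. */
static int tensor_is_float(const Onnx__TensorProto *t)
{
    /* TensorProto.DataType.FLOAT == 1 in onnx.proto */
    return t->data_type == ONNX__TENSOR_PROTO__DATA_TYPE__FLOAT;
}

Calling such a check on each input before resolve and inference gives an early, clear error instead of a failure deep inside an operator.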

📌 Disclaimer

This project is not associated with ONNX in any way; it is not an official solution, nor is it officially supported by ONNX. It is just an application built on top of the .onnx format that aims to help people who want to run inference on devices not supported by the official runtimes. Use at your own risk.

📗 License

MIT License
