Comments (3)
Very fair request. We had #30 for this but haven't had time to look into it.
Some quick information that you may find useful. The inference function runs inference on a model given a set of inputs:
inference(Onnx__ModelProto *model, Onnx__TensorProto **inputs, int nInputs)
The input is propagated towards the output, and through the all_context global variable you can access the output of each node. So it shouldn't be very difficult to do. Here is some quick, pseudocode-ish C; have a look at the documentation for more information about the Onnx__TensorProto struct.
int main()
{
    /* Create your input tensor and allocate some memory for it */
    Onnx__TensorProto *inp0set0 = ...;
    /* Allocate memory for dims, float_data, ...
       Note that you also have to fill n_dims, dims and name */
    inp0set0->float_data[0] = 10;
    inp0set0->float_data[1] = 10;
    /* Open your ONNX model */
    Onnx__ModelProto *model = openOnnxFile("model.onnx");
    /* Set the input name to match the model's first input */
    inp0set0->name = model->graph->input[0]->name;
    /* Create the array of inputs to the model */
    Onnx__TensorProto *inputs[] = { inp0set0 };
    /* Resolve all inputs and operators */
    resolve(model, inputs, 1);
    /* Run inference on your input */
    Onnx__TensorProto **output = inference(model, inputs, 1);
    /* Print the last output, which is the model output */
    for (int i = 0; i < all_context[_populatedIdx].outputs[0]->n_float_data; i++) {
        printf("float_data[%d] = %f\n", i, all_context[_populatedIdx].outputs[0]->float_data[i]);
    }
    return 0;
}
Let me know if you need further assistance. Have a look into the inference.c and connxr.c files; you can reuse some stuff. And by the way, if you develop an example, it would be great to add it to the examples folder or the existing documentation.
Edit: The code was wrong; I forgot to call the resolve function.
from connxr.
Just edited the above code, there was a mistake.
Closing this due to inactivity, let us know if you need further support.
Related Issues (20)
- How to convert the model input to a .pb file?
- Two missing null check for return values of searchAttributeNyName()
- Help running inference using yolov5, 6 & 8
- remove/integrate old trace.{c,h} into tracing.h
- Add C++ ifdef inside header to disable mangling
- Github action windows build fails
- Wrong printf format string for type int32_t
- Support for hardware without file systems
- Memory leaks
- Add support for ConvTranspose2d
- Tracing not compatible with C99
- Benchmarking and time.h on Windows
- cannot find -lcunit
- build fail on linux and mac
- tiny yolov2 doesn't work on raspberry pi zero
- src/inference.c line 29
- using macro function to reduce execute_operator_***.c file to single c source file
- modify src/inference.c file target to resolve once and not depend on inputs
- Add operator can be write in macro without type
- example use other input