axera-tech / pulsar-docs
The docs repository of Pulsar, AXera's SoC AI toolchain, covering chips such as the AX630A, AX620A, and AX620U.
License: BSD 3-Clause "New" or "Revised" License
I am running:
pulsar run \
    resnet18_export_data/model/resnet18.joint resnet18_export_data/model/resnet18.onnx \
    --input resnet18_export_data/images/cat.jpg \
    --config resnet18_export_data/config/output_config.prototxt \
    --output_gt inference_results  # weird arg name
using the ResNet18 quick-start data. However, I get an error:
[18 11:00:46 <frozen super_pulsar.model_executor>:266] DBG [pulsar build] File "<frozen super_pulsar.toolchain_wrappers.wrapper_hat_maker>", line 117, in get_io_modification_onnx
[18 11:00:46 <frozen super_pulsar.model_executor>:266] DBG [pulsar build] TypeError: get_modification_onnx() missing 1 required positional argument: 'meta_dict'
[18 11:00:46 <frozen super_pulsar.func_wrappers.wrapper_pulsar_run>:242] ERR failed loading resnet18_export_data/model/resnet18.onnx, skipping
[18 11:00:46 <frozen super_pulsar.func_wrappers.wrapper_pulsar_run>:244] DBG Traceback (most recent call last):
File "<frozen super_pulsar.func_wrappers.wrapper_pulsar_run>", line 240, in main
File "<frozen super_pulsar.func_wrappers.pulsar_run.utils>", line 31, in load_model
File "<frozen super_pulsar.model_executor>", line 338, in __init__
File "<frozen super_pulsar.model_executor>", line 271, in wrap_src_model_to_joint
RuntimeError: pulsar build returned 1
The .joint file gets exported. However, the ONNX model cannot be loaded, and hence the inference comparison cannot be executed. It seems to lack some metadata. Any idea on how to fix this?
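To check whether metadata is actually missing, one can inspect the exported ONNX with the standard onnx Python package. This is only a diagnostic sketch, not a fix: which keys super_pulsar expects in its meta_dict is undocumented, so the script merely dumps whatever the file carries.

# Diagnostic sketch: dump the ONNX model's metadata with the standard
# `onnx` package. The keys super_pulsar's meta_dict expects are not
# documented, so this only shows what the exported file contains.
import onnx

model = onnx.load("resnet18_export_data/model/resnet18.onnx")
print("ir_version:", model.ir_version)
print("producer:", model.producer_name, model.producer_version)
# metadata_props holds arbitrary key/value string pairs attached at export time
for prop in model.metadata_props:
    print(f"metadata_props[{prop.key!r}] = {prop.value!r}")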
Version information of my toolchain:
root@0bed5bfd8f7b:/data# pulsar version
[W Context.cpp:69] Warning: torch.set_deterministic is in beta, and its design and functionality may change in the future. (function operator())
0.6.1.20
07305a6
https://pulsar-docs.readthedocs.io/zh_CN/latest/pulsar/introduction.html#id1
The "221, 222" here is too obscure. The sentence "221 and 222 each use half of the NPU resources" is not something anyone outside your company can understand.
I want to run inference on my exported .joint model to be able to evaluate its INT8 COCO mAP performance. I only see one way of doing this:
pulsar run \
    my_model.joint \
    --input resnet18_export_data/images/cat.jpg \
    --output_gt inference_results
And then read the generated .npy file. I want to avoid reading from a file, as it is a very time-consuming operation. Is there a way of returning the results from the system call itself? Are you planning to implement this?
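In the meantime, my workaround is to wrap the CLI in a small Python driver and point the output at a tmpfs mount, so the .npy round trip never touches a real disk. A rough sketch; the recursive glob pattern assumes pulsar run drops .npy files somewhere under the output directory, so confirm the actual layout once before relying on it:

# Workaround sketch: call `pulsar run` as a subprocess and read the results
# back with numpy. Writing to /dev/shm (tmpfs) keeps the .npy round trip
# in RAM. The output file layout is an assumption -- inspect the directory
# once to confirm what pulsar run actually produces.
import glob
import subprocess
import numpy as np

out_dir = "/dev/shm/inference_results"
subprocess.run(
    [
        "pulsar", "run", "my_model.joint",
        "--input", "resnet18_export_data/images/cat.jpg",
        "--output_gt", out_dir,
    ],
    check=True,
)

# Load whatever .npy files the run produced
for path in glob.glob(out_dir + "/**/*.npy", recursive=True):
    arr = np.load(path)
    print(path, arr.shape, arr.dtype)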