Comments (3)
@onnx/sig-operators
from onnx.
Good question. Looking at the onnx shape inference implementation, it computes the complete products (it does not skip the forwarded dimensions). Hence, it will not be able to infer the output dimension and will flag this as an error here.
However, the more general interpretation could be useful in some situations, I guess ... it may be worth investigating whether backend implementations support it.
Do you see any examples/models where this will be useful? If so, it may be worth updating the spec to allow it.
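To make the two interpretations concrete, here is a minimal, hypothetical sketch (not the actual onnx implementation): `None` stands for an unknown dynamic dimension, `0` forwards the corresponding input dimension, and `-1` is the wildcard. `skip_forwarded=True` models the more general interpretation where forwarded dimensions are excluded from the products on both sides.

```python
import math

def infer_reshape(in_shape, target, skip_forwarded=False):
    """Hypothetical sketch of Reshape shape inference (not the onnx code).

    in_shape entries may be None (unknown dynamic dims); target may contain
    0 (forward the input dim) and at most one -1 (wildcard).
    """
    out = [in_shape[i] if d == 0 else d for i, d in enumerate(target)]
    if -1 not in out:
        return out
    w = out.index(-1)
    # Dimensions that participate in the element-count products.
    in_dims = [d for i, d in enumerate(in_shape)
               if not (skip_forwarded and i < len(target) and target[i] == 0)]
    known = [d for i, d in enumerate(out)
             if i != w and not (skip_forwarded and target[i] == 0)]
    if any(d is None for d in in_dims + known):
        # The complete-products approach hits this for [None, a, b] -> [0, -1].
        raise ValueError("cannot infer -1: product contains an unknown dim")
    if 0 in known:
        # Empty tensor: 0 == 0 * k for every k, so -1 has no unique value.
        raise ValueError("cannot infer -1: known product is 0")
    out[w] = math.prod(in_dims) // math.prod(known)
    return out

# Complete products: [None, 3, 4] -> [0, -1] fails because n is unknown.
# Skipping the forwarded dim: -1 = 3 * 4 = 12, so the result is [None, 12].
```

With a concrete batch such as `[2, 3, 4]`, both interpretations agree and give `[2, 12]`.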
from onnx.
It seems to be useful for dealing with batch dimensions that might be zero. For example, reshaping from [n,a,b] to [0,-1] where n is the batch dimension.
We (the Nvidia TensorRT group) ran into the issue with fasterrcnn_resnet50_fpn.onnx (I think it's derived from here) and accidentally fed it random data. I'm guessing there's some kind of internal batch dimension there with a data-dependent length.
On the other hand, the "forwarding 0" is dangerous with networks that contain empty tensors, so there's much to be said for just discouraging "forwarding 0", even if it helps the use of wildcard -1. In retrospect, "forwarding -2" would have been a much better design, but Caffe chose 0.
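The empty-tensor hazard can be reproduced with NumPy, which implements the plain `-1` wildcard (though not Caffe-style `0`-forwarding): once a dimension is 0, the total element count is 0 and the wildcard has no unique value.

```python
import numpy as np

# With a nonzero batch the wildcard is well defined: 24 / 2 = 12.
print(np.zeros((2, 3, 4)).reshape(2, -1).shape)   # (2, 12)

# With a zero batch, 0 == 0 * k for every k, so NumPy refuses to pick one.
try:
    np.zeros((0, 3, 4)).reshape(0, -1)
except ValueError as e:
    print("ValueError:", e)
```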
from onnx.