
Comments (25)

vitoplantamura commented on May 18, 2024

from onnxstream.

GaelicThunder commented on May 18, 2024

Vito, you mad lad, it worked!
I had a 128 GB external SSD that I used for swap when generating bigger images on my Raspberry Pi project, which is nothing fancy like this project!
I was thinking, maybe I can help and write a "guide" on this? It might help others (I saw some questions popping up in the issues).

vitoplantamura commented on May 18, 2024

GaelicThunder commented on May 18, 2024

Oh, that is out of my knowledge, to be completely honest... We might as well ask Vito if he knows how to fix it, if another model doesn't work.

elliot-sawyer commented on May 18, 2024

All good, part of the fun of learning new stuff! Thanks heaps for your input.

GaelicThunder commented on May 18, 2024

Thanks a lot for the kind explanation!
I tried to follow the explanations and ran these commands:

python -m onnxruntime.tools.make_dynamic_shape_fixed --input_name x --input_shape 1,4,64,64 model.onnx model_fixed1.onnx

python -m onnxruntime.tools.make_dynamic_shape_fixed --input_name context --input_shape 1,77,768 model_fixed1.onnx model_fixed2.onnx (there is no encoder_hidden_states, only x, timesteps and context in my model; is it not 1.5? Maybe it's 1.4?)

python -m onnxruntime.tools.make_dynamic_shape_fixed --input_name timesteps --input_shape 1 model_fixed2.onnx model_fixed3.onnx

Then I ran Onnx Simplifier, went back to the notebook, and got the usual bunch of files.

Now I have the Converted folder with model.txt and a lot of .bin files inside.
If I try to use it, I get the same error as before:

#20
I tried to fix it as suggested in that issue, but I end up with many more errors like these:
=== ERROR === read_file: unable to open file (/home/d00l1n/Desktop/Converted_A0/text_encoder_fp32/model.txt).

=== ERROR === read_file: unable to open file (/home/d00l1n/Desktop/Converted_A0/unet_fp16/model.txt).
I tried to fix them in the same way as issue #20 and it kinda worked?

Then I tried to put the converted files into the unet folder, but I get this:
=== ERROR === Model::parse_tensor_string: invalid shape.

I moved my converted model into unet_fp16 and used the other folders from the normal SD 1.5 in the Windows release of OnnxStream.
I'm sure I'm making a mess somewhere.

The command I use is this, btw:
./sd --models-path ./Converted/ --prompt "space landscape" --steps 28 --rpi

Thanks again for the help and the patience!

[screenshot]

vitoplantamura commented on May 18, 2024

GaelicThunder commented on May 18, 2024

Sure, here is the output!
Unsqueeze -> 532
Cast -> 286
Mul -> 313
Cos -> 1
Sin -> 1
Concat -> 261
Gemm -> 24
Sigmoid -> 47
Conv -> 98
Reshape -> 441
InstanceNormalization -> 61
Add -> 265
Transpose -> 160
LayerNormalization -> 48
MatMul -> 158
Split -> 1
Einsum -> 64
Softmax -> 32
Shape -> 213
Gather -> 417
Div -> 125
Slice -> 32
Erf -> 16
Resize -> 3
TOTAL -> 3599
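As a quick sanity check on dumps like the one above, the per-operator counts can be re-summed and compared against the reported TOTAL (the list above does add up to 3599). A stdlib-only sketch; the `Name -> N` text format is assumed from the output pasted in this thread:

```python
# Sketch: re-tally "Op -> count" lines from an onnx2txt-style summary and
# verify they sum to the reported TOTAL line.
def check_total(summary: str) -> bool:
    counts = {}
    for line in summary.strip().splitlines():
        name, _, value = line.partition("->")
        counts[name.strip()] = int(value)
    total = counts.pop("TOTAL")
    return sum(counts.values()) == total
```

A mismatch would suggest the dump was truncated or mangled when pasting.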

vitoplantamura commented on May 18, 2024

vitoplantamura commented on May 18, 2024

GaelicThunder commented on May 18, 2024

Sorry for the late answer, I didn't have time.
So, I tried with the export_onnx.py modification, but I ran into one problem: my laptop has 6 GB of VRAM and (this time) couldn't convert it (without the edit it can; I don't know why).
So I tried with my desktop, but I have a Radeon VII, so I had to make the extension run on the CPU.

The script worked and converted it, but this time I didn't get only the .onnx, I got a .data file as well.
However, Onnx Simplifier and onnx2txt still showed the "Shape" operator.

Unsqueeze -> 513
Mul -> 357
Cos -> 1
Sin -> 1
Concat -> 253
Gemm -> 24
Sigmoid -> 47
Conv -> 98
Reshape -> 441
InstanceNormalization -> 61
Add -> 361
Transpose -> 160
ReduceMean -> 96
Sub -> 48
Pow -> 48
Sqrt -> 48
Div -> 170
MatMul -> 156
Split -> 2
Einsum -> 64
Softmax -> 32
Shape -> 194
Gather -> 402
Slice -> 32
Erf -> 16
Resize -> 3
TOTAL -> 3628

So I retried converting with export_onnx, forcing
x = torch.randn(1, 4, 16, 16).to(devices.device, devices.dtype)
timesteps = torch.zeros((1,)).to(devices.device, devices.dtype) + 500
context = torch.randn(1, 77, 768).to(devices.device, devices.dtype)
to the right values. (I don't know if that was a good idea, but I wanted to test it myself before coming back to ask for help!)
BUT I got the same result, for some reason...
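For what it's worth, the 1×4×16×16 latent in the snippet above corresponds to a 128×128 image: SD 1.x's VAE downsamples by a factor of 8, so a 512×512 target needs a 1×4×64×64 latent. A small sketch of the relationship (my derivation, not from the thread; the `timesteps`/`context` shapes follow the commands quoted earlier):

```python
# Sketch: derive SD 1.5 UNet dummy-input shapes from the target image size.
# The VAE downsamples by a factor of 8, so 512x512 -> a 1x4x64x64 latent.
def sd15_dummy_shapes(height: int = 512, width: int = 512) -> dict:
    f = 8  # VAE downsampling factor in SD 1.x
    return {
        "x": (1, 4, height // f, width // f),  # latent input
        "timesteps": (1,),
        "context": (1, 77, 768),               # CLIP text embeddings
    }
```

So for matching the make_dynamic_shape_fixed commands above, the x dummy would be 1,4,64,64 rather than 1,4,16,16.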

Now I'm gonna upload the model so you can take a look.
Thanks a lot, really, you are helping me so much!

vitoplantamura commented on May 18, 2024

GaelicThunder commented on May 18, 2024

Sorry it took me so long to answer!
I was trying to make it work on my desktop Radeon VII but got nothing but problems. I'm waiting for a GPU on Colab (Google hasn't given me one for some time now) and I'll get back to you!
Again, sorry for the 5-day late reply!

vitoplantamura commented on May 18, 2024

GaelicThunder commented on May 18, 2024

I did a pull request, #36; if that is not OK I will write a separate guide for you to link! Thanks again!

elliot-sawyer commented on May 18, 2024

@GaelicThunder, could you provide an example of how to run the onnx2txt.ipynb script to convert a safetensors model to an ONNX format suitable for a Pi 4? I've been following along with the dreamshaper_8 model, and while it does process to a dreamshaper_8.onnx file and a bunch of biases and weights in a folder, it won't run on the Pi because it's missing tokenizer/vocab.txt. onnx2txt seems to be the missing link there, but I'm not sure how to use the ipynb file to process it.

If you're not using Dreamshaper 8, it'd be good to see an example of how you got this to work with a model other than the base ones. Thanks!

GaelicThunder commented on May 18, 2024

Hi @elliot-sawyer, sorry, I'm a little lost: are you able to run onnx2txt? For the missing tokenizer/vocab.txt, have you extracted the base SD 1.5 from here? After you extract the base SD 1.5 weights, overwrite them with the ones you get from onnx2txt. This should fix that error (I say should because the expert is Vito, not me; I'm just an enthusiast, eheh).

elliot-sawyer commented on May 18, 2024

I'm not sure how to run onnx2txt at all, or even what it does. I'm just trying to use models other than the base one on my Pi; the base SD 1.5 example is working perfectly :)

Thanks anyway for your help!

GaelicThunder commented on May 18, 2024

Oh my, now I get it! OK, so, from the instructions:
This code can run a model defined in path_to_model_folder/model.txt (all the model operations are defined in the model.txt text file; OnnxStream expects to find all the weights files in that same folder, as a series of .bin files). The model.txt file contains all the model operations in ASCII format, as exported from the original ONNX file; each line corresponds to an operation. In order to export the model.txt file and its weights (as a series of .bin files) from an ONNX file for use in OnnxStream, a notebook (with a single cell) is provided (onnx2txt.ipynb).
Basically, you can open the ipynb on Colab (or wherever you run your notebooks) and change:

ONNX_FILENAME = "<ONNX_FILENAME.onnx>"
DEST_FOLDER = "<DEST_FOLDER>"

to what suits your case.

The program will output a bunch of files that you will have to put in the folder as I said earlier, on top of the other SD 1.5 files.

Let me know how it goes, and tell me if you managed to do the Onnx Simplifier part, since you didn't mention it!

elliot-sawyer commented on May 18, 2024

Sweet! I'll have another go with Onnx Simplifier. I did try it, and if I recall it spat out some warnings, but a ton of files were generated. I believe the only thing missing was the vocab.txt file, which prevented it from running on the Pi. I wasn't very thorough recording my progress, so I'll have another crack and report back.

When you wrote these instructions, did you try them with a custom model other than the base SD models? If I follow the same one you used, I might be able to eventually spot some differences.

GaelicThunder commented on May 18, 2024

Yes, I tried it with a custom model I made (since the normal SD 1.5 Vito used was already transformed and working, and I wanted my custom anime s**t); it was a merged one with some modifications. I think the link is down now though, sorry! It should work with any other SD 1.5 model though.
Also, I said Onnx Simplifier; however, remember to follow the main guide, as it can use up to ~100 GB of swap!

elliot-sawyer commented on May 18, 2024

I had a go with it this evening and here's what I found:

  1. Trying to convert https://civitai.com/models/4384/dreamshaper, tried both v7 and v8
  2. Got an error: 'aten::scaled_dot_product_attention' to ONNX opset version 14 is not supported. Fixed with this snippet and changed to version 17

The option B steps do not work; it fails on the first command: onnx.onnx_cpp2py_export.checker.ValidationError: The model does not have an ir_version set properly.

The onnx_simplifier command shown in the readme doesn't work for me; through some searching I found this instead: python3 -m onnxsim dreamshaper8.onnx dreamshaper8_simplified.onnx

I get the same error when running that command:

Simplifying...
Traceback (most recent call last):
  File "/usr/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/usr/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/path/to/local/lib/python3.10/site-packages/onnxsim/__main__.py", line 5, in <module>
    main()
  File "/path/to/local/lib/python3.10/site-packages/onnxsim/onnx_simplifier.py", line 489, in main
    model_opt, check_ok = simplify(
  File "/path/to/.local/lib/python3.10/site-packages/onnxsim/onnx_simplifier.py", line 199, in simplify
    model_opt_bytes = C.simplify(
onnx.onnx_cpp2py_export.checker.ValidationError: The model does not have an ir_version set properly.

I might try with another model entirely to see if that makes a difference

elliot-sawyer commented on May 18, 2024

@vitoplantamura I've tried a few different models; the provided python script seems to return the same error consistently: onnx.onnx_cpp2py_export.checker.ValidationError: The model does not have an ir_version set properly. Any feedback is very appreciated!

thexa4 commented on May 18, 2024

I had this happen too; it was caused by the model being saved using external data. Using this simplifier worked: https://github.com/luchangli03/onnxsim_large_model

noah003 commented on May 18, 2024

Have you solved your problem? @elliot-sawyer
