Comments (30)
from onnxstream.
Thanks for your reply, I'll try to build it today.
I tried to build it on Google Colab: XNNPACK built successfully, but I got an error when building the Stable Diffusion example.
I built with clang 16.

First error:
ld.lld: error: undefined symbol: __android_log_vprint
Fixed with:
LDFLAGS=-llog cmake -DXNNPACK_DIR=$HOME/XNNPACK ..

Next error:
error: "./sd": executable's TLS segment is underaligned: alignment is 8, needs to be at least 64 for ARM64 Bionic
Fixed with:
termux-elf-cleaner sd

Next error:
Segmentation fault
Will try to fix later.
https://github.com/Fcucgvhhhvjv/Android-Stable-diffusion-ONNX/blob/master/Untitled2.ipynb
Here you can check what's wrong.
I tried to use the commit ID with git checkout <id>, but that didn't work either.
@Fcucgvhhhvjv
git checkout $(git rev-list -n 1 --before="2023-06-27 00:00" master)
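For anyone following along: the one-liner above resolves the last commit on master made before the given date and detaches HEAD to it. A minimal sketch of the same steps (the ~/OnnxStream clone path is an assumption):

```shell
# Resolve the last commit on master before 2023-06-27, then check it out.
# Run inside the clone; ~/OnnxStream is an assumed path.
cd ~/OnnxStream
OLD_COMMIT=$(git rev-list -n 1 --before="2023-06-27 00:00" master)
echo "pinning to $OLD_COMMIT"
git checkout "$OLD_COMMIT"   # detached HEAD at that commit
```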
@vitoplantamura with which CPU?
I successfully got it built in termux proot.
Are you guys trying it in a proot environment or plain Termux?
How can I make it faster? By default one step takes 3-4 minutes.
Building directly in Termux gives this error:
clang-16: error: no such file or directory: '/XNNPACK/build/libXNNPACK.a'
clang-16: error: no such file or directory: '/XNNPACK/build/pthreadpool/libpthreadpool.a'
clang-16: error: no such file or directory: '~/XNNPACK/build/cpuinfo/libcpuinfo.a'
make[2]: *** [CMakeFiles/sd.dir/build.make:116: sd] Error 1
make[1]: *** [CMakeFiles/Makefile2:83: CMakeFiles/sd.dir/all] Error 2
make: *** [Makefile:91: all] Error 2
@Fcucgvhhhvjv
cmake -DXNNPACK_DIR=$HOME/XNNPACK ..
cmake --build . --config Release
It is a Samsung Galaxy Tab S8 Plus. Maybe you could give it a try by editing CMakeLists.txt instead of setting LDFLAGS, Vito
Something was wrong with my clang; reinstalling everything helped me.
Yeah, I used it to build in Termux proot,
but the plain Termux environment doesn't work.
I compiled in a fresh, clean Termux without problems; just add link_libraries("log") to CMakeLists.txt.
Because of core parking, not all cores are used by default.
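If you want to see core parking for yourself, the standard Linux sysfs files show which cores exist versus which are currently online (readable from a normal Termux shell; the ranges in the comments are only examples):

```shell
# Cores that physically exist on the SoC
cat /sys/devices/system/cpu/present   # e.g. 0-7
# Cores currently online; parked cores disappear from this range
cat /sys/devices/system/cpu/online    # e.g. 0-3 while the big cores are parked
# Thread count actually visible to a process right now
nproc
```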
Where do I add this?
link_libraries("log") anywhere, or at the very end?
This is the log:
~/.../src/build $ cmake --build . --config Release
[ 33%] Linking CXX executable sd
clang-16: error: no such file or directory: '/XNNPACK/build/libXNNPACK.a'
clang-16: error: no such file or directory: '/XNNPACK/build/pthreadpool/libpthreadpool.a'
clang-16: error: no such file or directory: '/XNNPACK/build/cpuinfo/libcpuinfo.a'
make[2]: *** [CMakeFiles/sd.dir/build.make:116: sd] Error 1
make[1]: *** [CMakeFiles/Makefile2:83: CMakeFiles/sd.dir/all] Error 2
make: *** [Makefile:91: all] Error 2
~/.../src/build $
and this is my CMakeCache.txt:
This is the CMakeCache file.
For build in directory: /data/data/com.termux/files/home/OnnxStream/src/build
It was generated by CMake: /data/data/com.termux/files/usr/bin/cmake
You can edit this file to change values found and used by cmake.
If you do not want to change any of the values, simply exit the editor.
If you do want to change a value, simply edit, save, and exit the editor.
The syntax for the file is as follows:
KEY:TYPE=VALUE
KEY is the name of a variable in the cache.
TYPE is a hint to GUIs for the type of VALUE, DO NOT EDIT TYPE!.
VALUE is the current value for the KEY.
########################
EXTERNAL cache entries
########################
link_libraries("log")
//Path to a program.
CMAKE_ADDR2LINE:FILEPATH=/data/data/com.termux/files/usr/bin/llvm-addr2line
//Path to a program.
CMAKE_AR:FILEPATH=/data/data/com.termux/files/usr/bin/llvm-ar
//Choose the type of build, options are: None Debug Release RelWithDebInfo
// MinSizeRel ...
CMAKE_BUILD_TYPE:STRING=RelWithDebInfo
and the error
~/.../src/build $ cmake --build . --config Release
CMake Error: Parse error in cache file /data/data/com.termux/files/home/OnnxStream/src/build/CMakeCache.txt on line 385. Offending entry: link_libraries("log")
CMake Error: Parse error in cache file /data/data/com.termux/files/home/OnnxStream/src/build/CMakeCache.txt on line 385. Offending entry: link_libraries("log")
-- Configuring incomplete, errors occurred!
make: *** [Makefile:206: cmake_check_build_system] Error 1
@Fcucgvhhhvjv you should compile XNNPACK first
Add link_libraries("log") to OnnxStream/src/CMakeLists.txt,
right below
link_libraries("pthread")
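If you prefer to patch the file from the shell, something like this should do it (GNU sed, as shipped in Termux; the clone path is an assumption, so adjust it and keep a backup):

```shell
# Insert link_libraries("log") directly after the existing pthread line
sed -i '/link_libraries("pthread")/a link_libraries("log")' \
    ~/OnnxStream/src/CMakeLists.txt
# Verify the result
grep -n 'link_libraries' ~/OnnxStream/src/CMakeLists.txt
```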
Thanks, I was able to build it successfully.
I have lots of quantized int4/fp16 ONNX models in my Hugging Face repo that work with
https://github.com/ZTMIDGO/Android-Stable-diffusion-ONNX/
How can I use those models with this repo?
Let me know and I'll provide a link to my repo with the models.
@vitoplantamura
Why does this command fail to link the library and build the binary in Termux:
cmake -DMAX_SPEED=ON -DXNNPACK_DIR=<DIRECTORY_WHERE_XNNPACK_WAS_CLONED> ..
cmake --build . --config Release
But this works:
cmake -DXNNPACK_DIR=$HOME/XNNPACK ..
cmake --build . --config Release
I want to check if -DMAX_SPEED=ON makes any difference.
Thank you
@Fcucgvhhhvjv
I compiled with glibc gcc (termux-pacman glibc repo) and there's no improvement vs the standard Termux clang.
@vitoplantamura thanks for testing it out.
Also, the original Linux environment should be faster than Termux, right?
Does the inference speed depend heavily on the number of threads? I have noticed slow speeds in a Hugging Face Jupyter notebook, Google Colab, and a Kaggle kernel.
I will test with speed mode off to see if it is any better.
@romanovj
With cmake in the Termux environment on Android 13 (SD 860),
I got around 40 seconds per iteration, so inference takes about 3-5 minutes with ./sd.
On Termux proot Ubuntu I got around 2-3 minutes per iteration, same with Google Colab, Kaggle, and a Hugging Face Jupyter space.
I still have to test how it performs with speed mode off on Colab, Hugging Face Spaces, and Kaggle.
interesting
Ohh, that explains it; Colab and the other CPU providers have 4 threads at max.
Update: my benchmarks.
Model from releases, Snapdragon 662, 4GB RAM, LOS 20 GSI (A13).
Command:
sd --rpi

standard Termux (system's Bionic libc, clang 16):
19m56s
glibc Termux environment (glibc, gcc 13.2 + -DMAX_SPEED=ON):
18m03s
above + LD_PRELOAD libjemalloc.so:
16m51s
gcc + -DMAX_SPEED=ON + libjemalloc.so + OpenMP:
16m03s

Full cmd:
MALLOC_CONF="oversize_threshold:1,background_thread:true,metadata_thp:always,thp:always,narenas:2,dirty_decay_ms:10000,muzzy_decay_ms:10000" OMP_NUM_THREADS=2 LD_PRELOAD=/data/data/com.termux/files/usr/glibc/lib/libjemalloc.so ./sd --rpi