Comments (3)
Hi! As of now, the WebLLM project only supports devices/browsers with WebGPU support (support can be checked via https://webgpureport.org/).
While we have not tried the Samsung Note20 (we have only tested on the Samsung S23 and Pixel 7 Pro), I would suggest upgrading Android and updating Chrome to the latest version, as both may affect WebGPU support. You can verify via https://webgpureport.org/ after upgrading.
If the device still does not support WebGPU even with the most up-to-date versions, you can also check out MLC-LLM (WebLLM is a companion project of MLC-LLM), which has Android support: https://llm.mlc.ai/docs/deploy/android.html
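For a quick in-page check before loading any model, you can probe for WebGPU yourself. This is a minimal sketch (not WebLLM's own code): `navigator.gpu` is the standard WebGPU entry point, and https://webgpureport.org/ performs a far more thorough probe.

```javascript
// Feature-detect the WebGPU API entry point.
// Returns true only if the environment exposes navigator.gpu.
function hasWebGPU() {
  return typeof navigator !== "undefined" && "gpu" in navigator;
}

// Even when the API is exposed, the hardware may be unsupported;
// requestAdapter() resolves to null in that case.
async function webGPUAdapterAvailable() {
  if (!hasWebGPU()) return false;
  const adapter = await navigator.gpu.requestAdapter();
  return adapter !== null;
}

console.log("WebGPU API exposed:", hasWebGPU());
```

In an unsupported browser (or when run outside a browser), both checks report `false`, which matches the behavior the comment above describes.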
from web-llm.
I did manage to squeeze out a few tokens per second, but these tiny models are still far from production-ready.
Thanks a lot for the advice. I wonder if we will see capable 1-bit LLMs soon, or other tiny but coherent models.
I'll stay tuned... All the best! 🙏🧘♀️
Glad it works!