Comments (14)
Hi @depuytnl,
Ollama is running on 10.1.1.8:11434 and I can use the REST API, but when using the WebUI I get no answer to any request.
Currently, the Ollama WebUI requires Ollama to be accessible from the same IP address as the Ollama WebUI. To address this issue, I'll be adding a feature that allows you to edit the Ollama address via an environment variable soon.
Thank you for your patience, and I'll keep you posted on the progress of this feature. Feel free to reach out if you have any further questions or concerns.
from ollama-webui.
Hi,
No, that hasn't solved it. I have posted a question about how to assign specific GPUs to ollama serve in the Ollama issue tracker. That would be a solution.
I will experiment with docker settings as well. If I find a solution I will post it to this thread.
Thanks for the great support!
Hi Alexander,
Could you verify if the Ollama server is running on http://127.0.0.1:11434/?
Thanks.
Yes. curl http://127.0.0.1:11434 answers "Ollama is running".
Hi there,
Are you using Windows by any chance? If so, please try adding "--add-host=host.docker.internal:host-gateway" to the docker command. I've recently updated the run.sh file, so please pull the latest code and test it out. Let me know if it resolves the issue.
Alternatively, if the aforementioned solution doesn't work, you can try running Ollama with the following configuration:
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
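For reference, a full WebUI command with that flag might look like the sketch below. This is untested here; the image name and port mapping are illustrative and should be adjusted to your setup.

```shell
# Hypothetical full command; adjust image name and ports to your setup.
# host.docker.internal:host-gateway maps the special hostname to the
# Docker host, so the container can reach an Ollama server running there.
docker run -d -p 3000:3000 \
  --add-host=host.docker.internal:host-gateway \
  --name ollama-webui --restart always ollama-webui
```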
Keep me posted on your progress.
Thanks!
Hi,
the docker "--add-host=..." param seems to have solved the 500 error. I can also confirm that the 500 error reappears when Ollama is not running.
Running Ollama with the OLLAMA_HOST environment variable set as recommended also made everything work.
The only issue I'm still having is that it does not work with the Ollama docker image. I'm running the container using the following command:
docker run -d --gpus device=5 -v ollama:/root/.ollama -p 0.0.0.0:11434:11434 --name ollama ollama/ollama
The GUI is running and I can select models, but when I submit a prompt the GUI shows it being processed and then nothing else happens; it just hangs indefinitely.
Actually, I don't rely on Docker, but I need to restrict Ollama to a specific GPU; otherwise it just grabs all available GPUs in the server. Do you know how to parameterize Ollama to use a specific GPU?
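One approach I have seen suggested (untested on my side) is to hide the other GPUs from the CUDA runtime before starting the server:

```shell
# Untested sketch: expose only GPU index 5 to a bare-metal ollama serve.
# CUDA_VISIBLE_DEVICES is honored by the CUDA runtime, so the process
# should only see (and use) the listed device.
CUDA_VISIBLE_DEVICES=5 ollama serve
```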
I'm also struggling. I run the container with the command below. Ollama is running on 10.1.1.8:11434 and I can use the REST API, but when using the WebUI I get no answer to any request.
docker run -d -p 4444:3000 --add-host=host.docker.internal:10.1.1.8 --name ollama-webui --restart always ollama-webui
The only issue I'm still having is that it does not work with the Ollama docker image. I'm running the container using the following command:
Hi,
I'm glad to hear that the issue has been resolved and that it's now working for you. Regarding the Ollama docker image, you could try adding --net=host to make the container appear as if it's running on the host itself. The command would look something like this:
docker run -d --gpus device=5 -v ollama:/root/.ollama --net=host --name ollama ollama/ollama
I haven't personally tested the command, so I can't guarantee it will work. Please let me know if the above command works for you.
Feel free to reach out if you need further assistance.
Hi @depuytnl,
I have just implemented an environment variable that allows you to connect to the model when Ollama is hosted on a different server. You can utilize the environment variable -e OLLAMA_ENDPOINT="http://[insert your Ollama address]" to establish the connection.
For your specific use case, the following code should work:
docker build -t ollama-webui .
docker run -d -p 4444:3000 --add-host=host.docker.internal:host-gateway -e OLLAMA_ENDPOINT="http://10.1.1.8:11434" --name ollama-webui --restart always ollama-webui
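If it still doesn't connect, one quick sanity check (just a diagnostic sketch) is to confirm the variable actually made it into the running container:

```shell
# Print the environment variable from inside the ollama-webui container;
# with the -e flag above it should echo http://10.1.1.8:11434.
docker exec ollama-webui sh -c 'echo "$OLLAMA_ENDPOINT"'
```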
Feel free to test it out, and please let me know if you encounter any issues or if you have any other questions.
Thanks for the very quick update. I have just tried it, but sadly the result is the same. It doesn't output any errors; is there anything I can switch on to get better feedback? Thanks again!
I'm sorry to hear that the environment variable didn't work as expected. Could you please provide the logs by using the following command:
docker logs ollama-webui
This should provide us with all the logs to help us understand what might be causing the issue.
Additionally, were you able to run Ollama with the following command successfully?
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
Please let me know the outcome of these steps so I can assist you further.
I also just noticed that one of the files wasn't pushed to the repository. Could you please pull the latest changes and try the same command again with the latest release?
Let me know if this resolves the issue or if you need any further assistance.
Thanks.
I just tried with the latest release, but still get the same result. The logs have the following content:
Listening on 0.0.0.0:3000
http://10.1.1.8:11434
I'm running the ollama container as provided by that project, so I'm not sure how I could change the command. I did, however, run the commands below from inside the ollama-webui container, which I believe show that the Ollama REST interface is reachable from the ollama-webui container.
set
ENV='prod'
HOME='/root'
HOSTNAME='90eee2e42f0c'
IFS='
'
NODE_VERSION='20.8.1'
OLLAMA_ENDPOINT='http://10.1.1.8:11434'
OPTIND='1'
PATH='/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin'
PPID='0'
PS1='# '
PS2='> '
PS4='+ '
PWD='/app'
TERM='xterm'
YARN_VERSION='1.22.19'
curl http://10.1.1.8:11434
Ollama is running
Thank you for sharing the logs and the details of the commands you ran within the ollama-webui container. Since the Ollama server appears to be reachable from the container, the issue is likely related to CORS.
If you were able to access the main page, it indicates that CORS might indeed be causing the problem. To help us further diagnose the issue, could you please provide a screenshot of your console logs from the browser's developer tools?
Additionally, to enable CORS from the Ollama server, it is necessary to run the following command:
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS=* ollama serve
Kindly execute this command for the Ollama server. After making these changes, please attempt to access the Ollama WebUI again to check if the issue is resolved.
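As a quick way to check whether CORS headers are being sent at all (a diagnostic sketch; the origin below is the WebUI address from your setup), you could try:

```shell
# Request the Ollama API while presenting the WebUI's origin. If
# OLLAMA_ORIGINS is in effect, the response headers should include an
# Access-Control-Allow-Origin entry; if it is missing, the browser will
# block the WebUI's requests even though curl succeeds.
curl -i -H "Origin: http://10.1.1.8:4444" http://10.1.1.8:11434/api/tags
```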
Thanks.