Comments (6)
Yes, that does seem to work! Thank you.
Here is an example of the compose file I used:
version: '3.3'
services:
  ollama:
    container_name: ollama
    volumes:
      - "/your/path:/root/.ollama"
    restart: unless-stopped
    image: ollama/ollama
  ollama-webui:
    container_name: ollama-webui
    environment:
      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
    depends_on:
      - ollama
    ports:
      - 8080:8080
    restart: unless-stopped
    image: ghcr.io/ollama-webui/ollama-webui:main
from ollama-webui.
Hi, just merged the dev branch with the backend reverse proxy feature. Please take a look at the setup instructions, as they have also been updated. Let me know if you're experiencing any issues. Thanks!
@tjbck Thanks for the change! While this gets closer to the desired result, I think there might be some problems with this approach.
In order to use
- "host.docker.internal:host-gateway"
(which changes the "Ollama Server URL" to /ollama/api), the Ollama API port still needs to be exposed outside the container stack. This is problematic, as users might not want to expose the Ollama API outside the stack at all. I've included a docker compose example below.
When two services are in the same stack, they can use the built-in Docker DNS to resolve each other's hostnames. So in the example below, ollama-webui can connect directly to ollama using the "ollama" DNS name. For example, if you start an interactive shell in the ollama-webui container, you will be able to connect directly to the Ollama API inside the stack at "http://ollama:11434/api" without needing to expose the API port in the compose file.
If I comment out the port being exposed, the webui connection to the API fails:
version: '3.3'
services:
  ollama:
    container_name: ollama
    volumes:
      - '/your/path:/root/.ollama'
    #ports:
    #  - '11434:11434' # optional - for exposing the API outside the container stack
    restart: unless-stopped
    image: ollama/ollama
  ollama-webui:
    container_name: ollama-webui
    extra_hosts:
      - "host.docker.internal:host-gateway"
    #environment:
    #  - OLLAMA_API_BASE_URL='http://ollama.example.com/api' # optional - for when ollama is on a remote host
    depends_on:
      - ollama
    ports:
      - '8080:8080'
    restart: unless-stopped
    image: ghcr.io/ollama-webui/ollama-webui:main
Possible solutions:
- Some environment variable where the internal hostname can be specified, like
<variable_name>='http://ollama:11434/api'
- Re-working the existing variable "OLLAMA_API_BASE_URL" so it can leverage the docker internal DNS?
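As a sketch of the second solution: assuming OLLAMA_API_BASE_URL can accept an in-stack hostname (the behaviour being asked about here, not yet confirmed), the webui service would only need a fragment like the following, with no ports section on the ollama service at all:

```yaml
services:
  ollama-webui:
    container_name: ollama-webui
    environment:
      # "ollama" resolves via Docker's built-in DNS to the ollama
      # service on the same compose network; no host port is exposed
      - "OLLAMA_API_BASE_URL=http://ollama:11434/api"
    depends_on:
      - ollama
```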
Thanks!
Here are some test results I ran, if it helps:
Jump into an interactive shell on the ollama-webui container:
docker exec -t -i ollama-webui /bin/bash
Install dig, ping, and curl:
apt update && apt install -y dnsutils iputils-ping curl
DNS for ollama resolves as expected, providing the internal IP of the ollama service inside the stack:
root@2b0c83a687fd:/app/backend# dig ollama
; <<>> DiG 9.11.5-P4-5.1+deb10u9-Debian <<>> ollama
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 5796
;; flags: qr rd ra; QUERY: 1, ANSWER: 1, AUTHORITY: 0, ADDITIONAL: 0
;; QUESTION SECTION:
;ollama. IN A
;; ANSWER SECTION:
ollama. 600 IN A 172.22.0.2
;; Query time: 0 msec
;; SERVER: 127.0.0.11#53(127.0.0.11)
;; WHEN: Thu Nov 16 23:07:35 UTC 2023
;; MSG SIZE rcvd: 46
pinging ollama works:
root@2b0c83a687fd:/app/backend# ping -c 1 ollama
PING ollama (172.22.0.2) 56(84) bytes of data.
64 bytes from ollama.ollama_default (172.22.0.2): icmp_seq=1 ttl=64 time=0.034 ms
--- ollama ping statistics ---
1 packets transmitted, 1 received, 0% packet loss, time 0ms
rtt min/avg/max/mdev = 0.034/0.034/0.034/0.000 ms
A curl test to the API works:
root@2b0c83a687fd:/app/backend# curl -X POST http://ollama:11434/api/generate -d '{
"model": "mistral",
"prompt":"Why is the sky blue?"
}'
<expected json output>
This looks promising, I'll take a look and update the docker compose + backend files accordingly. In the meantime, a PR is always welcome. Thanks!
Now that I'm taking a second look, you should be able to leverage the OLLAMA_API_BASE_URL environment variable to connect to Ollama without exposing the port externally. Could you please try setting the environment variable to OLLAMA_API_BASE_URL=http://ollama:11434/api and see if it functions as you intended?
Regardless of your result, I'll still update the docker compose file for enhanced security. Thanks!