Comments (19)
I'll update the readme soon with Ollama web UI support. Many thanks for the help and the donation too!
I'm hitting the same issue; I confirmed the bearer token is working with Postman. I've enabled "debug" in Twinny settings but can't seem to find the output anywhere, and my Ollama server does not see any requests being made by Twinny.
I struggled with the same question. Here is how to view the debug output: #72 (comment)
Give it another try with your ollama web ui, but update Twinny to the latest version first.
@rjmacarthy recently made some changes so that Twinny now supports oobabooga; it might be that the integration with ollama web ui was fixed as well.
I'm not the author and no JS programmer myself, but based on the error it looks like the ollama-web-ui OpenAI API endpoint times out. Are you able to get responses if you send the same request using curl? See #72 for examples of how I did that for oobabooga/textgen-webui.
Paste your curl output here, it might help @rjmacarthy to pin that down.
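For reference, a request of that shape might look something like this; the host, path, token, and model below are all placeholders, not values from this thread:

```sh
# Hypothetical reproduction of the request Twinny would make; every
# value here (host, path, token, model) is a placeholder for your setup.
curl -N https://ollama.example.tld/ollama/api/generate \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model": "codellama", "prompt": "hello", "stream": true}'
```

The `-N` flag disables curl's output buffering, so the streamed chunks show up as they arrive.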
It looks like every inference server that claims to be OpenAI-API-compatible still does some things differently from the others.
Yeah, you see, in my example (for Jan.ai) the JSON for streamed responses looked different: #72 (comment)
I'm pretty sure that different software wraps streamed responses into JSON differently. To support them all, the plugin needs to expose some configuration setting, a sort of path inside the JSON, where you can hint to it where exactly to look for the response text and how to tell when the response is over. I hope @rjmacarthy will figure out how it can be done, so all the users of different inference apps will get their silver bullet.
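As a rough sketch of that idea (the two chunk shapes below are just common examples of how servers differ, not an exhaustive list), the configurable path could work the way jq paths do:

```sh
# Ollama-style streamed chunk: the text lives at .response
echo '{"response":"foo","done":false}' | jq -r '.response'
# OpenAI-style streamed chunk: the text lives at .choices[0].delta.content
echo '{"choices":[{"delta":{"content":"foo"}}]}' \
  | jq -r '.choices[0].delta.content'
```

Both commands print the same token, `foo`, even though the wrappers differ; a per-backend path setting would let the plugin do the equivalent lookup.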
Haha of course, it would be too easy if everyone just agreed on a standard and stuck with it 😂.
Thanks @kha84 for your insights!
@rjmacarthy please let me know if you need anything else from me on this issue.
Hey, I did some debugging, but I'm really not sure why it's not working right now, sorry.
Update: I think it's fixed now... I just tested it and it seems to work as long as the auth credentials are correctly set. Please test on the latest version and let me know, thanks.
@rjmacarthy that did it, everything is working now, thank you!
why would you use the webui? Just use Ollama directly.
I'm not sure what could be causing it, but can look into it. Thanks for the report.
> why would you use the webui? Just use Ollama directly.
I can't speak for the person who opened this issue, but I use the webui to secure ollama because it's exposed to the internet.
@kha84 oh I see, the logs are in the webview devtools console.
I did some more testing today and I think the bearer token is working, as the wrong token gets a 401 response as it should. But using the correct token, this is what I get:
Interestingly, I only see the request in the ollama and webui pods when the token is invalid (401 unauthorized); when using the correct token there is no output from either in the logs.
Even more interestingly, using the Twinny chat sidebar completely hangs the Ollama server; it will not respond to any `generate` or `chat` requests after that has been attempted until I restart the container it's running in.
Here is the output of the `curl` request, which seems to work fine and does not cause any hangs:
Same exact results with code completion as with the chat sidebar: `curl` works, but Twinny times out and hangs the Ollama server without any log output from the container.
Twinny settings:

```json
{
  "twinny.apiHostname": "ollama.redacted-hostname.tld",
  "twinny.chatApiPort": 443,
  "twinny.fimApiPort": 443,
  "twinny.useTls": true,
  "twinny.apiBearerToken": "REDACTED",
  "twinny.chatApiPath": "/ollama/api/generate",
  "twinny.fimApiPath": "/ollama/api/generate",
  "twinny.enableLogging": true
}
```
@rjmacarthy any way I can help? I could add a user for you on my Ollama server for debugging.
Hey @dlford, I think it's due to the ollama webui backend supporting `text/event-stream` with `stream: false` for the `content-type` request header, while all of the other supported APIs use `application/json` and `stream: true`. I ran everything locally, even the ollama webui server, and requests were working after I hacked around authentication and cors, but they were ending prematurely due to content-length errors, so I need more time or help.
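For anyone following along, one way to check which behaviour a given backend exhibits is to inspect the response headers. A sketch, with placeholder host, path, token, and model:

```sh
# Dump only the response headers (-D -) and discard the streamed body;
# host, path, token, and model are placeholders.
curl -sN -D - -o /dev/null https://ollama.example.tld/ollama/api/generate \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"model": "codellama", "prompt": "hi", "stream": true}'
# An NDJSON backend typically answers with Content-Type: application/x-ndjson
# and body lines like:  {"response":"foo","done":false}
# An SSE backend answers with Content-Type: text/event-stream and frames
# like:                 data: {"response":"foo"}
```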
@rjmacarthy that's weird, I'll try to find some time to explore the code base and tinker a bit. I appreciate you!