Comments (19)

rjmacarthy commented on May 26, 2024

I'll update the readme soon with Ollama web UI support. Many thanks for the help and the donation too!

from twinny.

kha84 commented on May 26, 2024

> I'm hitting the same issue, confirmed the bearer token is working with Postman. I've enabled "debug" in Twinny settings but can't seem to find the output anywhere, my Ollama server does not see any requests being made by Twinny.

I struggled with the same question. Here is how to view the debug output - #72 (comment)

Give it another try with your ollama web ui, but update Twinny to the latest version first.
@rjmacarthy recently made some changes so that Twinny now supports oobabooga; it may be that the integration with ollama web ui was fixed as well.

kha84 commented on May 26, 2024

I'm not the author and not a JS programmer myself, but based on the error it looks like the ollama-web-ui OpenAI API endpoint is timing out. Are you able to get responses if you send the same request using curl? See #72 for examples of how I did that for oobabooga/text-generation-webui.
Paste your curl output here, it might help @rjmacarthy pin that down.

It looks like every inference server that claims to be OpenAI-API-compatible still does some things differently from the others.
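
For reference, a minimal sketch of that kind of check from Node 18+ (using the built-in fetch) might look like this. The hostname, token, and model name are placeholders, and it assumes an OpenAI-style /v1/chat/completions route, which may differ from the /ollama/api/generate path used elsewhere in this thread:

```typescript
// Sketch only: reproduce the curl check programmatically.
// All concrete values below (URL, token, model) are placeholders.

export interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

// Build the request the same way the curl command would:
// bearer auth header plus a JSON body with a stream flag.
export function buildChatRequest(
  baseUrl: string,
  token: string,
  prompt: string,
  stream = true,
): ChatRequest {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({
      model: "codellama", // placeholder model name
      messages: [{ role: "user", content: prompt }],
      stream,
    }),
  };
}

// Usage (actual network call, not run here):
// const req = buildChatRequest("https://ollama.example.tld", "TOKEN", "hello");
// const res = await fetch(req.url, { method: "POST", headers: req.headers, body: req.body });
// console.log(res.status, res.headers.get("content-type"));
```

Comparing the status code and content-type you get here against what Twinny sees would narrow down where the two diverge.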

kha84 commented on May 26, 2024

Yeah, you see, in my example (for Jan.ai) the JSON for streamed responses looked different - #72 (comment)

I'm pretty sure that different software wraps streamed responses into JSON differently. To support them all, the plugin needs to expose a configuration setting - a sort of path inside the JSON - that hints where exactly to look for the response characters and how to tell when the response is over. I hope @rjmacarthy figures out how it can be done, so that users of all the different inference apps get their silver bullet.
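
A rough sketch of that configurable-path idea (the setting name and chunk shapes are illustrative, not Twinny's actual settings): a value like "choices.0.delta.content" would tell the client where the text lives in each streamed JSON chunk.

```typescript
// Walk a dot-separated path (e.g. "choices.0.delta.content") through a
// parsed JSON value; return undefined if any step along the way is missing.
export function extractByPath(value: unknown, path: string): unknown {
  let current: unknown = value;
  for (const key of path.split(".")) {
    if (current === null || typeof current !== "object") return undefined;
    current = (current as Record<string, unknown>)[key];
  }
  return current;
}

// The same piece of text as served by two different backend chunk shapes:
const openAiChunk = { choices: [{ delta: { content: "Hel" } }] };
const ollamaChunk = { response: "Hel", done: false };

// One extractor, two user-configured paths:
console.log(extractByPath(openAiChunk, "choices.0.delta.content")); // "Hel"
console.log(extractByPath(ollamaChunk, "response")); // "Hel"
```

With something like this, supporting a new backend would be a settings change rather than a code change.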

dlford commented on May 26, 2024

Haha of course, it would be too easy if everyone just agreed on a standard and stuck with it 😂.

Thanks @kha84 for your insights!

@rjmacarthy please let me know if you need anything else from me on this issue.

rjmacarthy commented on May 26, 2024

Hey, I did some debugging, but I'm really not sure why it's not working right now, sorry.

rjmacarthy commented on May 26, 2024

Update: I think it's fixed now...I just tested it and it seems to work as long as the auth credentials are correctly set. Please test on the latest version and let me know, thanks.

dlford commented on May 26, 2024

@rjmacarthy that did it, everything is working now, thank you!

technovangelist commented on May 26, 2024

why would you use the webui? Just use Ollama directly.

rjmacarthy commented on May 26, 2024

I'm not sure what could be causing it, but can look into it. Thanks for the report.

dlford commented on May 26, 2024

I'm hitting the same issue, confirmed the bearer token is working with Postman. I've enabled "debug" in Twinny settings but can't seem to find the output anywhere, my Ollama server does not see any requests being made by Twinny.

dlford commented on May 26, 2024

> why would you use the webui? Just use Ollama directly.

I can't speak for the person who opened this issue, but I use the webui to secure ollama because it's exposed to the internet.

dlford commented on May 26, 2024

@kha84 oh I see, the logs are in the webview devtools console.

I did some more testing today and I think the bearer token is working, since a wrong token gets a 401 response as it should. But with the correct token this is what I get:

[screenshot: ksnip_20240211-135547]

dlford commented on May 26, 2024

Interestingly, I only see the request in the ollama and webui pods when the token is invalid (401 Unauthorized); when using the correct token there is no output in the logs from either.

dlford commented on May 26, 2024

Even more interestingly, using the Twinny chat sidebar completely hangs the Ollama server: it will not respond to any generate or chat requests after that has been attempted until I restart the container it's running in.

Here is the output of the curl request, which seems to work fine and does not cause any hangs:

[screenshot: ksnip_20240211-160450]

Code completion gives exactly the same results as the chat sidebar: curl works, but Twinny times out and hangs the Ollama server without any log output from the container.

@rjmacarthy

dlford commented on May 26, 2024

Twinny Settings:

{
  "twinny.apiHostname": "ollama.redacted-hostname.tld",
  "twinny.chatApiPort": 443,
  "twinny.fimApiPort": 443,
  "twinny.useTls": true,
  "twinny.apiBearerToken": "REDACTED",
  "twinny.chatApiPath": "/ollama/api/generate",
  "twinny.fimApiPath": "/ollama/api/generate",
  "twinny.enableLogging": true
}

dlford commented on May 26, 2024

@rjmacarthy is there any way I can help? I could add a user for you on my Ollama server for debugging.

rjmacarthy commented on May 26, 2024

Hey @dlford, I think it's because the ollama webui backend responds with a text/event-stream content-type and stream: false, while all of the other supported APIs use application/json and stream: true. I ran everything locally, even the ollama webui server, and requests were working after I hacked around authentication and CORS, but they were ending prematurely due to content-length errors, so I need more time or help.
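
To illustrate the difference between the two wire formats (this is a hypothetical sketch, not Twinny's actual code): an SSE backend frames each chunk as a `data: {...}` line, while an NDJSON backend sends one bare JSON object per line, so a client that wants to support both has to normalise the lines before parsing.

```typescript
// Normalise SSE ("data: {...}") and NDJSON ("{...}") stream lines
// into a single array of parsed JSON events.
export function parseStreamLines(raw: string): unknown[] {
  const events: unknown[] = [];
  for (const line of raw.split("\n")) {
    const trimmed = line.trim();
    if (trimmed === "") continue; // SSE event separator / trailing newline
    // Strip an SSE "data:" prefix if present; NDJSON lines pass through.
    const payload = trimmed.startsWith("data:")
      ? trimmed.slice("data:".length).trim()
      : trimmed;
    if (payload === "[DONE]") break; // OpenAI-style stream terminator
    events.push(JSON.parse(payload));
  }
  return events;
}

// Both inputs yield the same two parsed objects:
const sse = 'data: {"response":"Hi"}\n\ndata: {"response":"!"}\n\ndata: [DONE]\n';
const ndjson = '{"response":"Hi"}\n{"response":"!"}\n';
console.log(parseStreamLines(sse).length, parseStreamLines(ndjson).length); // 2 2
```

The content-length errors would be a separate issue, but normalising the framing like this is one way to cope with backends that disagree on the streaming format.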

dlford commented on May 26, 2024

@rjmacarthy that's weird, I'll try to find some time to explore the code base and tinker a bit. I appreciate you!
