Comments (6)
Thanks @kv-gits for the suggestion; it's possible. Do you know which models support FIM completions with llama.cpp? Edit: I am working on this in #47
from twinny.
Thanks for that. I am creating a new release now, version 3, which adds support for llama.cpp. You'll need to update some settings in the extension for it to work. FIM is working with the model I tried, but with the same model chat is producing garbage; there is an option to use a different endpoint for chat if required.
I just released version v3.0.0, which includes support for llama.cpp using the --server option. The host, port and endpoints need to be configured in the extension settings. Please let me know how you get on.
Many thanks
As I found in the issues, llama.cpp supports FIM tokens, and all three models twinny uses have GGUF versions on Hugging Face:
https://github.com/ggerganov/llama.cpp/tree/master/examples/infill
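For reference, the llama.cpp server exposes an infill endpoint that can be exercised directly with curl. This is a minimal sketch, assuming a server already running locally on port 8080 with a FIM-capable model loaded; the prefix/suffix payload follows the fields documented in the llama.cpp server example, and the code snippet itself is just an illustration:

```shell
# Ask a running llama.cpp server to fill in the middle between a
# prefix and a suffix (assumes localhost:8080; adjust to your setup).
curl http://localhost:8080/infill \
  --header "Content-Type: application/json" \
  --data '{
    "input_prefix": "def fib(n):\n    ",
    "input_suffix": "\n    return result\n",
    "n_predict": 32
  }'
```

This is roughly what the extension does under the hood for FIM completions, which makes it a useful sanity check when the editor integration misbehaves.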
I think there was a small bug in v3.0.0 which still checked for model availability on Ollama. I fixed it and released v3.0.2 to remove this check.
It would be nice to turn off the "install ollama" window after installation, and also to provide an example command to run the llama.cpp server for a quick start.
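A quick start along those lines might look like the following. The binary name and flags follow the llama.cpp server example as it stood at the time; the model path is a placeholder, not something specified in this thread, and any FIM-capable GGUF model should work:

```shell
# Placeholder quick start: serve a GGUF model over HTTP with llama.cpp.
# Build llama.cpp first, then point -m at your downloaded model file.
./server -m models/stable-code-3b.Q4_K_M.gguf \
  --host 127.0.0.1 \
  --port 8080 \
  -c 2048   # context size in tokens
```

The host and port chosen here would then need to match the values configured in the extension's settings.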
Related Issues (20)
- Unable to select model when using Ollama HOT 3
- command 'twinny.showSidebar' not found HOT 1
- Change shortcuts for suggestions HOT 2
- Improve documentation for FIM HOT 2
- How to configure proxy? HOT 1
- Incomplete Code Autocompletion and Non-Responsive Chat UI in Twinny Extension HOT 5
- feat: open new chat window in new editor tab HOT 1
- Edit and re-submit in chat mode HOT 4
- Robot icon keeps spinning, no inference HOT 1
- Code completion not working HOT 8
- Code completion works, but chat just spins the progress circle indefinitely HOT 2
- Configured providers but twinny not sending any requests to provider. HOT 3
- Multiline completion is confusing HOT 1
- Code snippets in the chat window lose syntax highlighting occasionally
- FIM doesn't work with Keep Alive = -1 HOT 1
- Cannot read long model names when configuring provider
- Option to save provider configuration to disk
- invalid option provided option=""
- Codeqwen uses same FIM template as stable-code HOT 1
- Context Length Option With File Context Enabled Doesn't Limit Length