Comments (5)
Thanks, it was probably a configuration issue: the minor release added some new settings for API connections, so I expected some breaking changes. I just pushed a fix for the "0" being displayed when the models weren't loading.
from twinny.
Okay, never mind: after deleting and resetting the profiles (and restarting Ollama), it works again.
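For anyone else hitting the "0 models" symptom above: before resetting profiles, it can help to confirm that Ollama is actually up and serving models. The sketch below queries Ollama's standard `/api/tags` model-listing endpoint (default local address assumed; the helper names are ours, not part of twinny):

```python
import json
import urllib.request


def parse_model_names(payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response body."""
    return [m["name"] for m in payload.get("models", [])]


def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    """Return the models the local Ollama server currently exposes."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))
```

If `list_ollama_models()` returns an empty list (or the request fails outright), twinny showing 0 models is expected, and the fix is on the Ollama side, e.g. pulling a model or restarting the server.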
Thanks for your work! My 3090 goes brr again.
I had to reset the settings and then select the models again; the chat provider needs to be on top and then FIM. But now it works.
By the way, how do I select the active provider when I add more than one FIM and chat provider?
I tried copying one, so FIM was on top and the chat copy was second, but then chat became unresponsive, maybe FIM too.
The title of the second provider said "ollama chat Copy". (Then I pressed reset, selected my models again, and it worked. Not complaining, I am happy lol.)
Hey, I am not sure if this is still an issue :). To select the provider, click the little robot icon above the chat box. I see that your FIM provider uses a deepseek model but the template is codellama; you should use automatic or deepseek as your FIM template instead.
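The mismatch called out above (a deepseek model paired with the codellama template) is a common cause of broken completions. As a rough sketch of the idea behind the automatic option (illustrative only, not twinny's actual implementation), picking the FIM template family from the model name could look like:

```python
def pick_fim_template(model_name: str) -> str:
    """Guess a FIM template family from the model name.

    Illustrative sketch only: twinny's real "automatic" option may use
    different rules, so set the template explicitly when in doubt.
    """
    name = model_name.lower()
    if "deepseek" in name:
        return "deepseek"
    if "codellama" in name or "code-llama" in name:
        return "codellama"
    if "starcoder" in name:
        return "starcoder"
    return "automatic"  # let the extension decide for unknown models
```

For example, `pick_fim_template("deepseek-coder:6.7b-base")` returns `"deepseek"`, matching the advice above: the template must match the model family, or the FIM prompt format the model was trained on won't line up.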
Related Issues (20)
- VSCodium reports as it is not compatible with VS Code '1.81.1'. HOT 2
- Twinny for Visual Studio Community 2022 HOT 1
- Instructions for the configuration on MacOS with llama.cpp HOT 3
- Inline suggestion does not work when there is a .hg folder HOT 1
- Unable to interact with ollama running VSCode with WSL2 HOT 3
- Business use HOT 1
- Not showing all model after update to latest version HOT 8
- LM Studio supports "Multi Mode Session", must specify the model name HOT 3
- Extension does not load at all after update to v3.11.18 HOT 2
- Can't get the FIM to work with LM Studio HOT 3
- Keyboard Shortcuts - Add opt-in option HOT 2
- Type error fetch failed t.streamResponse HOT 4
- Keyboard short cut: accept part of the FIM suggestion HOT 4
- TypeError while attempting to use FIM from remote Open WebUI server HOT 4
- Wrong language: getting python suggestions in Javascript file HOT 1
- Unable to select model when using Ollama HOT 3
- command 'twinny.showSidebar' not found HOT 1
- Change shortcuts for suggestions HOT 2
- Improve documentation for FIM HOT 2