dpayne / codegpt.nvim
CodeGPT is a plugin for neovim that provides commands to interact with ChatGPT.
License: GNU General Public License v3.0
Hi,
With the recent introduction of Codestral (https://mistral.ai/news/codestral/), I am wondering if support is planned, or would be easy to implement, for nvim, particularly for the FIM feature. So far, I haven't seen any implementation that handles this for Vim.
What I have in mind is the ability to activate FIM in a "copilot" mode that automatically suggests completions at the cursor, with a customizable backend. Specifically, I would like to:
What do you think about this?
Please add a question command:
command | input | Description |
---|---|---|
question | text selection and command args | Will ask ChatGPT a question (the command args) about the selected code. |
I think this is a quite frequent use case. We already have "explain", but it offers a generic explanation rather than an answer to a specific question about the selected code.
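For what it's worth, this could probably be sketched today with the custom-commands mechanism shown elsewhere in this thread. The template variable names and the callback_type value below are assumptions and may not match the plugin's actual API:

```lua
-- Hypothetical "question" command, following the custom-command pattern
-- used for "eli" and "modernize" in this thread. Template variables are
-- assumed to be {{language}}, {{filetype}}, {{text_selection}}, {{command_args}}.
vim.g["codegpt_commands"] = vim.tbl_extend("force", vim.g["codegpt_commands"] or {}, {
    ["question"] = {
        user_message_template = "I have a question about the following {{language}} code: "
            .. "```{{filetype}}\n{{text_selection}}```\n{{command_args}}",
        callback_type = "text_popup",
    },
})
```

Usage would then be something like `:'<,'>Chat question why does this loop allocate?`.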
As I'm trying to get this configured, I've run into the following issue after any :Chat call:
Error: post https://api.openai.com/v1/chat/completions - curl error exit_code=23
stderr={ "curl: (23) Failed writing received data to disk/application" }.
What else could be happening?
Hi,
I am getting this error when calling :Chat, since #42 I guess.
.vim/bundle/CodeGPT.nvim/plugin/codegpt.lua:5: attempt to index upvalue 'CodeGptModule' (a boolean value)
missing "dpayne"
I used:
use({
    "dpayne/CodeGPT.nvim",
    requires = { "nvim-lua/plenary.nvim", "MunifTanjim/nui.nvim" },
})
Hi there, thanks for making this neat plugin! I have been trying to add language instructions as per the README but haven't had much luck. When following the Example configuration section to add instructions to use pytest for Python tests, they appeared not to be respected:
require("codegpt.config")
vim.g["codegpt_commands"] = {
["tests"] = {
language_instructions = {
python = "Use the pytest unit framework",
},
}
}
Adding a print for debugging showed that the prompt didn't include these instructions; it said the following (the two spaces between the sentences suggest the language_instructions template was not rendered):
Write really good unit tests using best practices for the given language. Only return the unit tests.
I did confirm that the buffer I was in had the appropriate filetype.
I was able to get it working with this configuration; is the documentation out of date or is this a bug? (using this functionality)
vim.g.codegpt_lang_instructions_python_tests = "Use the pytest test framework."
Would be happy to try and contribute a fix if you think this is worth fixing. Thanks!
When using nvim with the setting vim.o.wrap = false, the popup that appears when using the Chat command (:Chat hello world) also does not wrap the text.
This is probably expected behavior, but for this use case I think no one would want the popup without wrapping, so enabling wrapping by default on this popup window would probably be better.
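A minimal sketch of the suggested default, assuming the plugin keeps a handle to the popup window somewhere in lua/codegpt/ui.lua (the popup.winid field name is a guess):

```lua
-- After the popup window is created, force line wrapping on it regardless
-- of the user's global vim.o.wrap setting. `popup.winid` is a hypothetical
-- handle; the real field name in lua/codegpt/ui.lua may differ.
vim.api.nvim_win_set_option(popup.winid, "wrap", true)
vim.api.nvim_win_set_option(popup.winid, "linebreak", true) -- wrap at word boundaries
```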
It would be nice to have the class/function, etc. definitions pulled into the context when we select some code.
Features:
Additional thoughts:
lua/codegpt.lua makes it clear that when there is no text selection but there are args, the command must be chat. Why is that?
if text_selection ~= "" and command_args ~= "" then
    local cmd_opts = CommandsList.get_cmd_opts(command)
    if cmd_opts ~= nil and has_command_args(cmd_opts) then
        command_args = table.concat(opts.fargs, " ", 2)
    elseif cmd_opts and 1 == #opts.fargs then
        command_args = ""
    else
        command = "code_edit"
    end
elseif text_selection ~= "" and command_args == "" then
    command = "completion"
elseif text_selection == "" and command_args ~= "" then
    command = "chat"
end
For example, this custom command will not work, unless you select some arbitrary whitespace first:
vim.g["codegpt_commands"] = vim.tbl_extend("force", vim.g["codegpt_commands"] or {},
{
["eli"] = {
system_message_template = "You are a tutor to a ten year old child. Explain everything using simple English.",
user_message_template = "{{command_args}}",
callback_type = "text_popup",
}
})
I would expect :Chat eli how are cars made? to use the custom template, but it actually just reverts to "you are a general assistant to a software developer". This behaviour works if I select some whitespace and do :'<,'>Chat eli how are cars made?
A current workaround is to define your own command that directly calls run_cmd with the right command name:
vim.api.nvim_create_user_command("Eli", function(opts)
require("codegpt.commands").run_cmd("eli", table.concat(opts.fargs, " "), "")
end, {range = true, nargs = "*"})
Now :Eli how are cars made? works as expected. This shouldn't be necessary.
When I configure openai_api_key in config.lua like below:
vim.g["codegpt_openai_api_key"] = os.getenv("sk-**********************I")
How do I fix this bug?
Error Message:
Error executing Lua callback: ...ack/packer/start/CodeGPT.nvim/lua/codegpt/openai_api.lua:87: OpenAIApi Key not found, set in vim with 'codegpt_openai_api_key' or as the env variable 'OPENAI_API_KEY'
stack traceback:
[C]: in function 'error'
...ack/packer/start/CodeGPT.nvim/lua/codegpt/openai_api.lua:87: in function 'make_call'
.../pack/packer/start/CodeGPT.nvim/lua/codegpt/commands.lua:53: in function 'run_cmd'
...nvim/site/pack/packer/start/CodeGPT.nvim/lua/codegpt.lua:34: in function 'run_cmd'
...m/site/pack/packer/start/CodeGPT.nvim/plugin/codegpt.lua:4: in function <...m/site/pack/packer/start/CodeGPT.nvim/plugin/codegpt.lua:3>
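For what it's worth, the snippet above passes the key itself to os.getenv, which looks up an environment variable by name and therefore returns nil. Going by the error message, either of the following configurations should work (sketches, not verified against the plugin):

```lua
-- Option 1: read the key from the OPENAI_API_KEY environment variable,
-- which the error message says the plugin also checks on its own.
vim.g["codegpt_openai_api_key"] = os.getenv("OPENAI_API_KEY")

-- Option 2: set the key directly (avoid committing it to version control):
-- vim.g["codegpt_openai_api_key"] = "sk-..."
```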
Hello!
Thank you for this great plugin!
Could you provide under which license is the repository/plugin?
Hi,
Thanks for the plugin :). Since it can take a few seconds to get the ChatGPT response, it would be cool to have some sort of UI loading feedback (e.g. a statusline loading spinner).
I suggest having some callbacks like:
vim.g["codegpt_hooks"] = {
request_started = function()
-- update some ui, start a spinner
end,
request_finished = function()
-- update some ui, stop the spinner
end
}
A lualine component would be 🔥 .
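A rough sketch of what such a lualine component could look like, assuming the proposed codegpt_hooks API existed (the hook names and all of the wiring below are hypothetical):

```lua
-- Hypothetical: track request state via the proposed hooks.
local codegpt_busy = false

vim.g["codegpt_hooks"] = {
    request_started = function() codegpt_busy = true end,
    request_finished = function() codegpt_busy = false end,
}

-- A lualine component that shows an indicator while a request is in flight.
require("lualine").setup({
    sections = {
        lualine_x = {
            function() return codegpt_busy and "CodeGPT…" or "" end,
        },
    },
})
```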
To my understanding, the only real difference is that CodeGPT implements actions from Lua snippets, while ChatGPT also allows external JSON to be loaded, plus a number of utilities, UI, etc.
Is there any other difference?
Hello. Any idea why this might be happening? Sometimes the result is correct, but half the time you get this broken result.
Is there a way to debug actual API response to see if the problem is with the autocompleted result and not in the way plugin replaces the selected lines with the ChatGPT result?
Thanks!
Would it be possible to make the ChatGPT output show up in a new horizontal split buffer? This would make it easier to review the output and copy and paste what is needed back into the project I'm working on.
Thanks for creating the plugin. It's been fun to experiment with ChatGPT.
Please add a generate command:
command | input | Description |
---|---|---|
generate | command args | Will ask ChatGPT to generate the command args in the language of the buffer |
This is like completion in that it only outputs the code and puts it into the buffer, but without requiring any existing code. For example, if I open a new file and do :Chat generate fibonacci function, it should insert a function called fibonacci with the appropriate implementation in the language of the buffer.
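A sketch of how this might be expressible with the existing custom-commands mechanism, borrowing the template variables used elsewhere in this thread (the callback_type value is an assumption). Note that, as another issue in this thread points out, commands invoked without a text selection may currently fall back to chat, so this might also need the dispatch fix:

```lua
-- Hypothetical "generate" command built on the custom-commands mechanism.
vim.g["codegpt_commands"] = vim.tbl_extend("force", vim.g["codegpt_commands"] or {}, {
    ["generate"] = {
        user_message_template = "Write {{command_args}} in {{language}}. "
            .. "Only return the code snippet and nothing else.",
        callback_type = "code_popup", -- assumption; a replace-in-buffer callback may fit better
    },
})
```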
I have a YAML file where I want to reformulate some text, or ensure there are no grammar errors in my descriptions.
If I select one sentence and ask ChatGPT to reformulate it, it tells me the selection is not valid YAML.
Example:
info:
title: Example
version: '1.0'
description: |
Bienvenue sur la documentation XXX. Cette API vous permet de contrôler YYY et ZZZ.
If I select just the line with "Bienvenue" and ask (French for "fix the sentence"):
:Chat Corrige la phrase
The output will be:
I'm sorry, but the given input is not a valid YAML code snippet. YAML is a data serialization language and requires proper indentation and syntax.
Is there a way to avoid the YAML tagging of the selection?
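One possible workaround, sketched from the custom-command examples elsewhere in this thread: define a command whose template omits the {{filetype}} code fence, so the selection is sent as plain text rather than as a tagged code block (the callback_type value is an assumption):

```lua
-- Hypothetical "proofread" command whose template sends the selection
-- without a filetype-tagged code fence.
vim.g["codegpt_commands"] = vim.tbl_extend("force", vim.g["codegpt_commands"] or {}, {
    ["proofread"] = {
        system_message_template = "You are a proofreading assistant. "
            .. "Correct grammar and spelling; return only the corrected text.",
        user_message_template = "{{command_args}}\n{{text_selection}}",
        callback_type = "text_popup", -- assumption; a replace-selection callback may fit better
    },
})
```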
In the readme it says:
Custom Commands
Custom commands can be added to the vim.g["codegpt_commands"] configuration option to extend the available commands.

vim.g["codegpt_commands"] = {
    ["modernize"] = {
        user_message_template = "I have the following {{language}} code: ```{{filetype}}\n{{text_selection}}```\nModernize the above code. Use current best practices. Only return the code snippet and comments. {{language_instructions}}",
        language_instructions = {
            cpp = "Refactor the code to use trailing return type, and the auto keyword where applicable.",
        },
    }
}

The above configuration adds the command
:Chat modernize
that attempts to modernize the selected code snippet.
So, trying it, I added exactly that to my vimrc, selected this piece of C++ code:
#include <iostream>
using namespace std;
int foobar(){
return 0;
}
bool whatnow(){
return 1;
}
int main(){
return 0;
}
And the call made by CodeGPT is not using the user_message_template at all; instead it makes this call:
{
max_tokens = 4003,
messages = { {
content = "You are a C++ coding assistant.",
role = "system"
}, {
content = "I have the following C++ code: ```cpp\n#include <iostream>\nusing namespace std;\n\nint foobar(){\n return 0;\n}\n\nbool whatnow(){\n return 1;\n}\n\nint main(){\n return 0;\n}```\nmodernize. Only return the code snippet and nothing else.",
role = "user"
} },
model = "gpt-3.5-turbo",
n = 1,
temperature = 0.8
}
I know because I added this little snippet to my vimrc, which I suggest adding to the README under a 'Debugging' heading:
OpenAIApi = require("codegpt.openai_api")
function OpenAIApi.make_call(payload, cb)
print(vim.inspect(payload))
end
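Note that this override swallows the request entirely, since the original make_call never runs. A variant that logs the payload and still performs the call might look like this (assuming make_call keeps the (payload, cb) signature):

```lua
-- Wrap make_call instead of replacing it: log the payload, then delegate
-- to the original implementation so the request still goes out.
local OpenAIApi = require("codegpt.openai_api")
local original_make_call = OpenAIApi.make_call

function OpenAIApi.make_call(payload, cb)
    print(vim.inspect(payload)) -- inspect the outgoing request
    return original_make_call(payload, cb)
end
```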
Hi, first of all, thanks for this excellent plugin!
What I especially like about this plugin is that it's quite easy to customize. There's one more thing I'd like to be able to customize though: the popup's border. Perhaps you can add a vim.g['codegpt_popup_border'] option or something like that, that defaults to { style = 'rounded' }? Then you can simply change https://github.com/dpayne/CodeGPT.nvim/blob/master/lua/codegpt/ui.lua#L9 to:
border = vim.g['codegpt_popup_border'] or { style = 'rounded' },
Thanks in advance!
OpenAI's definition of tokens is different from the length of the message. The models have a much larger token capacity than message-length-based limits suggest.