
dpayne / codegpt.nvim


CodeGPT is a plugin for neovim that provides commands to interact with ChatGPT.

License: GNU General Public License v3.0

Lua 100.00%

codegpt.nvim's People

Contributors

00sapo, aap01, baggiponte, blob42, blockchainian, dpayne, freedomben, gstokkink, harry-optimised, icholy, jaime10a, jcdickinson, nmnduy, roobert, sherlockhoe90, tamamcglinn, ttbug, yuchanns, zzhirong


codegpt.nvim's Issues

Support for FIM (Fill-in-the-middle) ?

Hi,

With the recent introduction of Codestral (https://mistral.ai/news/codestral/), I am wondering whether support for it is planned, or would be easy to implement in nvim, particularly the FIM feature. So far, I haven't seen any implementation that handles this for Vim.

What I have in mind is a FIM-based "copilot" mode that automatically suggests a completion at the cursor, with a customizable backend. Specifically, I would like to:

  • Control when the suggestion is triggered or not
  • Control the timeout to trigger the call
  • Control cycling through suggestions
  • Control the context window size sent (prompt and suffix)
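None of the options below exist in codegpt.nvim today; this is purely a hypothetical sketch of what a configuration surface for such a mode could look like:

```lua
-- Hypothetical keys, for illustration only; nothing here is implemented.
vim.g["codegpt_fim"] = {
  enabled = true,          -- toggle the copilot-style FIM mode
  trigger = "auto",        -- or "manual" to suggest only on demand
  debounce_ms = 300,       -- idle time before a completion request fires
  cycle_keymap = "<M-]>",  -- cycle through alternative suggestions
  context = {
    prefix_lines = 50,     -- lines before the cursor sent as the prompt
    suffix_lines = 20,     -- lines after the cursor sent as the suffix
  },
}
```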

What do you think about this?

Feature Request: chat question

Please add a question command:

command  | input                           | description
question | text selection and command args | Asks ChatGPT a question (the command args) about the selected code.

I think this is a quite frequent use case. We already have "explain", but that offers a generic explanation rather than an answer to a specific question about the selected code.
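Assuming the template variables and callback type used elsewhere on this page ({{language}}, {{filetype}}, {{text_selection}}, {{command_args}}, text_popup), a custom command along these lines might approximate this already; a sketch, not tested:

```lua
vim.g["codegpt_commands"] = {
  ["question"] = {
    -- Ask a free-form question (the command args) about the selected code.
    user_message_template = "I have a question about the following {{language}} code: "
      .. "```{{filetype}}\n{{text_selection}}```\n{{command_args}}",
    callback_type = "text_popup",  -- show the answer rather than editing the buffer
  },
}
```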

openSUSE curl (8.7.1) error "Failed writing received data to disk/application"

As I'm trying to get this configured I've run into the following issue after any :Chat call:

Error: post https://api.openai.com/v1/chat/completions - curl error exit_code=23 
stderr={ "curl: (23) Failed writing received data to disk/application" }.
  • Checked the OPENAI_API_KEY environment variable
  • Checked that the tiktoken package is available

What else could be happening?

Language instructions appear to be ignored

Hi there, thanks for making this neat plugin! I have been trying to add language instructions as per the README but haven't had much luck. When following the Example configuration section to add instructions to use pytest for Python tests, they appeared not to be respected:

require("codegpt.config")

vim.g["codegpt_commands"] = {
  ["tests"] = {
    language_instructions = {
        python = "Use the pytest unit framework",
    },
  }
}

Adding a print for debugging showed that the prompt didn't include these instructions; it contained the following (the two spaces between the sentences suggest the language_instructions template variable rendered as empty):

Write really good unit tests using best practices for the given language. Only return the unit tests.

I did confirm that the buffer I was in had the appropriate filetype.

I was able to get it working with this configuration; is the documentation out of date or is this a bug? (using this functionality)

vim.g.codegpt_lang_instructions_python_tests = "Use the pytest test framework."

Would be happy to try and contribute a fix if you think this is worth fixing. Thanks!

Text does not wrap around in the popup by default

When using nvim with the setting vim.o.wrap = false, the popup that appears for the Chat command (:Chat hello world) also does not wrap its text.
This is probably expected behavior, but I don't think anyone would want this popup without wrapping, so enabling wrap by default on this popup window would probably be better.
(Screenshot attached: 2023-03-08 at 20:50:53)
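For what it's worth, enabling wrap on the popup likely only requires setting a window-local option when the popup is created; a sketch, assuming access to the popup's window handle (here called winid):

```lua
-- 'winid' is assumed to be the popup's window handle.
vim.api.nvim_set_option_value("wrap", true, { win = winid })
vim.api.nvim_set_option_value("linebreak", true, { win = winid })  -- break at word boundaries
```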

Feature: automatically pull token definitions in the codebase into the context when the token is selected.

It would be nice to have the class/function, etc. definitions pulled into the context when we select some code.

Features:

  • When we select some code, find the definition of the class, functions, etc. in the codebase and put it into the context. Might require an external tool to achieve this.
  • Make this feature optional.

Additional thoughts:

  • If the function body is too large, maybe we should reduce the content of the function.
  • Add a parameter to limit how many class/function definitions to include in the context.
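One possible route, sketched here under the assumption that an LSP server is attached to the buffer, is to ask the client for the definition of the symbol under the cursor and read those lines into the context:

```lua
-- Sketch: fetch the definition of the symbol under the cursor via LSP
-- and collect its text so it could be appended to the request context.
local params = vim.lsp.util.make_position_params()
vim.lsp.buf_request(0, "textDocument/definition", params, function(err, result)
  if err or not result or vim.tbl_isempty(result) then return end
  local loc = result[1] or result          -- servers may return one Location or a list
  local uri = loc.uri or loc.targetUri
  local range = loc.range or loc.targetRange
  local bufnr = vim.uri_to_bufnr(uri)
  vim.fn.bufload(bufnr)
  local lines = vim.api.nvim_buf_get_lines(
    bufnr, range.start.line, range["end"].line + 1, false)
  -- 'lines' now holds the definition text; a real implementation would
  -- truncate it and splice it into the prompt.
end)
```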

Enable custom commands with args and without text selection

lua/codegpt.lua makes it clear that when there is no text selection but there are args, the command must be chat. Why is that?

    if text_selection ~= "" and command_args ~= "" then
        local cmd_opts = CommandsList.get_cmd_opts(command)
        if cmd_opts ~= nil and has_command_args(cmd_opts) then
            command_args = table.concat(opts.fargs, " ", 2)
        elseif cmd_opts and 1 == #opts.fargs then
            command_args = ""
        else
            command = "code_edit"
        end
    elseif text_selection ~= "" and command_args == "" then
        command = "completion"
    elseif text_selection == "" and command_args ~= "" then
        command = "chat"
    end

For example, this custom command will not work, unless you select some arbitrary whitespace first:

vim.g["codegpt_commands"] = vim.tbl_extend("force", vim.g["codegpt_commands"] or {}, 
  {
    ["eli"] = {
        system_message_template = "You are a tutor to a ten year old child. Explain everything using simple English.",
        user_message_template = "{{command_args}}",
        callback_type = "text_popup",
    }
})

I would expect :Chat eli how are cars made? to use the custom template, but it actually falls back to "you are a general assistant to a software developer". It does work if I select some whitespace and run :'<,'>Chat eli how are cars made?

A current workaround is to define your own command that directly calls run_cmd with the right command name:

vim.api.nvim_create_user_command("Eli", function(opts)
    require("codegpt.commands").run_cmd("eli", table.concat(opts.fargs, " "), "")
end, {range = true, nargs = "*"})

Now :Eli how are cars made? works as expected. This shouldn't be necessary.

OpenAIApi Key not found, set in vim with 'codegpt_openai_api_key' or as the env variable 'OPENAI_API_KEY'

When I configure openai_api_key in config.lua like this:
vim.g["codegpt_openai_api_key"] = os.getenv("sk-**********************I")
I get the error below. How do I fix this bug?

Error Message:

Error executing Lua callback: ...ack/packer/start/CodeGPT.nvim/lua/codegpt/openai_api.lua:87: OpenAIApi Key not found, set in vim with 'codegpt_openai_api_key' or as the env variable 'OPENAI_API_KEY'
stack traceback:
[C]: in function 'error'
...ack/packer/start/CodeGPT.nvim/lua/codegpt/openai_api.lua:87: in function 'make_call'
.../pack/packer/start/CodeGPT.nvim/lua/codegpt/commands.lua:53: in function 'run_cmd'
...nvim/site/pack/packer/start/CodeGPT.nvim/lua/codegpt.lua:34: in function 'run_cmd'
...m/site/pack/packer/start/CodeGPT.nvim/plugin/codegpt.lua:4: in function <...m/site/pack/packer/start/CodeGPT.nvim/plugin/codegpt.lua:3>
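For reference, os.getenv takes the name of an environment variable, not the key value itself, so the quoted line always returns nil. A working setup would presumably look like:

```lua
-- In the shell: export OPENAI_API_KEY="sk-..."
-- Then read the variable by *name* in config.lua:
vim.g["codegpt_openai_api_key"] = os.getenv("OPENAI_API_KEY")
```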

License

Hello!
Thank you for this great plugin!
Could you clarify under which license the repository/plugin is provided?

Suggestion: Add lifecycle callbacks to support having loading indicators

Hi,

Thanks for the plugin :). Since it can take a few seconds to get the ChatGPT response, it would be cool to have some sort of UI loading feedback (e.g. a statusline loading spinner).

I suggest having some callbacks like:

vim.g["codegpt_hooks"] = {
  request_started = function() 
   -- update some ui, start a spinner
  end,

  request_finished = function()
   -- update some ui, stop the spinner
  end
}

A lualine component would be 🔥 .

What's the difference with ChatGPT.nvim?

To my understanding, the only real difference is that CodeGPT defines its actions in Lua snippets, while ChatGPT.nvim also allows external JSON action definitions to be loaded, plus a number of utilities, UI components, etc.

Is there any other difference?

Autocompleted result is broken

Hello. Any idea why this might be happening? Sometimes the result is correct, but half the time you get this broken result.

Is there a way to debug the actual API response, to see whether the problem is with the autocompleted result itself and not in the way the plugin replaces the selected lines with the ChatGPT result?

(Video attachment: Kapture 2023-03-09 at 11.05.08.mp4)

Thanks!

Feature Request: Horizontal Buffer Output

Would it be possible to make the ChatGPT output show up in a new horizontal buffer? This would make it easier to review the output and copy what is needed back into the project I'm working on.

Thanks for creating the plugin. It's been fun to experiment with ChatGPT.
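As a sketch of what such an output target could do (nothing here is in the plugin; the function name and its argument are illustrative), writing the response into a horizontal split might look like:

```lua
-- Illustrative only: open a horizontal split and fill it with the response.
local function show_in_hsplit(response_lines)
  vim.cmd("botright new")                 -- new horizontal split at the bottom
  local buf = vim.api.nvim_get_current_buf()
  vim.api.nvim_buf_set_lines(buf, 0, -1, false, response_lines)
  vim.bo[buf].buftype = "nofile"          -- scratch buffer, not tied to a file
  vim.bo[buf].bufhidden = "wipe"          -- clean up when the window closes
end
```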

Feature request: generate command

Please add a generate command:

command  | input        | description
generate | command args | Asks ChatGPT to generate the command args in the language of the buffer.

This is like completion in that it only outputs code and puts it into the buffer, but without requiring any existing code. For example, if I open a new file and run :Chat generate fibonacci function, it should insert a function called fibonacci with an appropriate implementation in the language of the buffer.
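Using the template variables shown elsewhere on this page, a custom command could come close to this today; a sketch, not tested:

```lua
vim.g["codegpt_commands"] = {
  ["generate"] = {
    -- {{command_args}} carries the request, e.g. "fibonacci function".
    user_message_template = "Write {{language}} code for the following request: "
      .. "{{command_args}}. Only return the code snippet and nothing else.",
    -- A buffer-inserting callback would be needed here; its name is not
    -- documented on this page, so none is assumed.
  },
}
```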

ChatGPT think my selection is not valid YAML

I have a YAML file where I want to rephrase some text, or make sure there are no grammar errors in my descriptions.
If I select one sentence and ask ChatGPT to rephrase it, it tells me that the selection is not valid YAML.

Example:

info:
  title: Example
  version: '1.0'
  description: |
    Bienvenue sur la documentation XXX. Cette API vous permet de contrôler YYY et ZZZ.

If I select just the line with "Bienvenue" and ask (French for "fix the sentence"):

:Chat Corrige la phrase

The output will be:

I'm sorry, but the given input is not a valid YAML code snippet. YAML is a data serialization language and requires proper indentation and syntax.

Is there a way to avoid the YAML tagging of the selection?

Adding custom commands does not work

In the readme it says:

Custom Commands

Custom commands can be added to the vim.g["codegpt_commands"] configuration option to extend the available commands.

vim.g["codegpt_commands"] = {
  ["modernize"] = {
      user_message_template = "I have the following {{language}} code: ```{{filetype}}\n{{text_selection}}```\nModernize the above code. Use current best practices. Only return the code snippet and comments. {{language_instructions}}",
      language_instructions = {
          cpp = "Refactor the code to use trailing return type, and the auto keyword where applicable.",
      },
  }
}

The above configuration adds the command :Chat modernize that attempts to modernize the selected code snippet.

So, trying it, I added exactly that to my vimrc, selected this piece of C++ code:

#include <iostream>
using namespace std;

int foobar(){
  return 0;
}

bool whatnow(){
  return 1;
}

int main(){
  return 0;
}

But the call made by CodeGPT does not use the user_message_template at all; it makes this call instead:

{
  max_tokens = 4003,
  messages = { {
      content = "You are a C++ coding assistant.",
      role = "system"
    }, {
      content = "I have the following C++ code: ```cpp\n#include <iostream>\nusing namespace std;\n\nint foobar(){\n  return 0;\n}\n\nbool whatnow(){\n  return 1;\n}\n\nint main(){\n  return 0;\n}```\nmodernize.
Only return the code snippet and nothing else.",
      role = "user"
    } },
  model = "gpt-3.5-turbo",
  n = 1,
  temperature = 0.8
}

I know because I added this little snippet to my vimrc, which I suggest adding to the README under a 'Debugging' heading:

OpenAIApi = require("codegpt.openai_api")
-- Overrides make_call so the payload is printed and no request is actually sent.
function OpenAIApi.make_call(payload, cb)
  print(vim.inspect(payload))
end

Feature request: allow styling of popup border

Hi, first of all, thanks for this excellent plugin!

What I especially like about this plugin is that it's quite easy to customize. There's one more thing I'd like to be able to customize though: the popup's border. Perhaps you could add a vim.g['codegpt_popup_border'] option or something like that, defaulting to { style = 'rounded' }? Then you could simply change https://github.com/dpayne/CodeGPT.nvim/blob/master/lua/codegpt/ui.lua#L9 to:

border = vim.g['codegpt_popup_border'],

Thanks in advance!
