opensouls / terminal-copilot
A smart terminal assistant that helps you find the right command.
License: Apache License 2.0
Suggestion: wrap
subprocess.run(['bash', '-c', '-i', 'alias'], capture_output=True)
in a try/except block, in place of
{subprocess.run(["alias"], capture_output=True).stdout.decode("utf-8")}
Traceback (most recent call last):
File "/home/mano/anaconda3/envs/cpenv/lib/python3.9/site-packages/copilot/copilot.py", line 44, in main
{subprocess.run(["alias"], capture_output=True).stdout.decode("utf-8")}
File "/home/mano/anaconda3/envs/cpenv/lib/python3.9/subprocess.py", line 505, in run
with Popen(*popenargs, **kwargs) as process:
File "/home/mano/anaconda3/envs/cpenv/lib/python3.9/subprocess.py", line 951, in init
self._execute_child(args, executable, preexec_fn, close_fds,
File "/home/mano/anaconda3/envs/cpenv/lib/python3.9/subprocess.py", line 1821, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'alias'
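A sketch of what the suggested change might look like (`get_aliases` is a hypothetical helper name, not the project's actual function; `alias` is a shell builtin, so it has to run through an interactive shell rather than as a standalone executable):

```python
import subprocess

def get_aliases() -> str:
    """Fetch shell aliases for the prompt context, tolerating failure.

    `alias` is a bash builtin, not an executable, so invoking it directly
    raises FileNotFoundError; running it via `bash -c -i` works because an
    interactive bash loads ~/.bashrc and knows its aliases.
    """
    try:
        result = subprocess.run(
            ["bash", "-c", "-i", "alias"],
            capture_output=True,
            timeout=5,
        )
        return result.stdout.decode("utf-8", errors="replace")
    except (FileNotFoundError, subprocess.TimeoutExpired):
        # No bash available (or it hung): proceed without alias context.
        return ""
```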
When you ask terminal-copilot to explain a shell command, it crashes with the following:
> explainshell: https://explainshell.com/explain?cmd=tar%20-xzvf%20%3Cfilename%3E.tar.gz
Traceback (most recent call last):
File "/home/ryan/.local/bin/copilot", line 8, in <module>
sys.exit(main())
File "/home/ryan/.local/lib/python3.8/site-packages/copilot/main.py", line 116, in main
show_command_options(prompt, cmd)
File "/home/ryan/.local/lib/python3.8/site-packages/copilot/main.py", line 149, in show_command_options
show_more_cmd_options(prompt)
File "/home/ryan/.local/lib/python3.8/site-packages/copilot/main.py", line 179, in show_more_cmd_options
show_command_options(prompt, cmds[cmd_menu_entry_index])
File "/home/ryan/.local/lib/python3.8/site-packages/copilot/main.py", line 147, in show_command_options
subprocess.run(["open", "https://explainshell.com/explain?cmd=" + quote(cmd)])
File "/usr/lib/python3.8/subprocess.py", line 493, in run
with Popen(*popenargs, **kwargs) as process:
File "/usr/lib/python3.8/subprocess.py", line 858, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/usr/lib/python3.8/subprocess.py", line 1704, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'open'
This is expected, as a WSL instance doesn't have the open command.
I plan to make a PR to fix this, which will check if the VM is a WSL instance, and if so, use wslview
or another command instead to open the URL. ETA for the PR is about a month once I find the time.
I no longer use this; I prefer GitHub Copilot's AI shell: https://githubnext.com/projects/copilot-cli
I probably won't have time to contribute this fix, but I'll keep the issue open for anyone else who runs into it.
When invoking the copilot command on a Windows 11 machine, a TypeError is raised indicating a missing argument for the function build_conversation().
Expected behavior: executing the copilot command should not produce any errors, and it should function as intended.
To reproduce, run the copilot command inside the virtual environment without the -q switch: copilot list files
Upon execution, the following error is displayed:
(.venv) PS C:\Users\Owner\zgit\terminal-copilot> copilot list files
Traceback (most recent call last):
File "C:\Users\Owner\zgit\terminal-copilot\.venv\Scripts\copilot-script.py", line 33, in <module>
sys.exit(load_entry_point('terminal-copilot==1.4.0', 'console_scripts', 'copilot')())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Owner\zgit\terminal-copilot\.venv\Lib\site-packages\terminal_copilot-1.4.0-py3.11.egg\copilot\main.py", line 74, in main
conversation = build_conversation(context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: build_conversation() missing 1 required positional argument: 'usermessage'
When using the -q switch, the error does not occur. For instance:
(.venv) PS C:\Users\Owner\zgit\terminal-copilot> copilot -q hello copilot
> Hello! How can I assist you today?
NOTE: the Windows installation steps are slightly different from the documentation:
Instead of source .venv/bin/activate, use . .\.venv\Scripts\activate
Instead of export OPENAI_API_KEY=<your key>, use $ENV:OPENAI_API_KEY = "YOUR_KEY"
Do we need to pay for the OpenAI API to be able to use this app?
Referencing this:
Running on macOS with zsh and Python 3.11. Typically, I get this:
❯ copilot list directory
Traceback (most recent call last):
File "/Users/jmoran/anaconda3/bin/copilot", line 8, in <module>
sys.exit(main())
^^^^^^
File "/Users/jmoran/anaconda3/lib/python3.11/site-packages/copilot/main.py", line 74, in main
conversation = build_conversation(context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: build_conversation() missing 1 required positional argument: 'usermessage'
But this is an easy fix: I imagine the Python version this was tested on is <3.11. I suggest changing the function definition's first line to
def build_conversation(context: Context, usermessage: Optional[str]=None) -> Conversation:
which works fine for me (although I am not yet sure this doesn't break something else).
On first use, it is helpfully suggested to export the OpenAI key.
Line 60:
print("export OPENAI_API_KEY = <your key>")
Following this exactly gives an error: "zsh: bad assignment"
If you remove the spaces from the export suggestion, users won't have to google the correct syntax:
print("export OPENAI_API_KEY=<your key>")
Thanks for building a great little tool :-)
Be mindful of context length. The number of aliases could get out of hand.
E.g. my personal setup likely has close to 1000 tokens for aliases alone.
Please add a general question mode so that I do not have to leave the command mode! E.g.:
$> copilot -g "Who was the 23rd president?"
Herbert Hoover
Looks like the dependency on the simple-term-menu package makes it impossible to use this on Windows: IngoMeyer441/simple-term-menu#71
Traceback (most recent call last):
File "C:\Users\ju195\Anaconda3\lib\runpy.py", line 197, in _run_module_as_main
return _run_code(code, main_globals, None,
File "C:\Users\ju195\Anaconda3\lib\runpy.py", line 87, in _run_code
exec(code, run_globals)
File "C:\Users\ju195\Anaconda3\Scripts\copilot.exe\__main__.py", line 4, in <module>
File "C:\Users\ju195\Anaconda3\lib\site-packages\copilot\copilot.py", line 8, in <module>
from simple_term_menu import TerminalMenu
File "C:\Users\ju195\Anaconda3\lib\site-packages\simple_term_menu.py", line 39, in <module>
raise NotImplementedError('"{}" is currently not supported.'.format(platform.system())) from e
NotImplementedError: "Windows" is currently not supported.
Could probably relax a few of the == version pins to >= or ~=.
Currently the "explain" option uses explainshell web page to explain the command.
Would be cool to design some prompts for openai that could also explain the command it proposed, and break it down into components. This would probably make it easier to verify the command proposed is correct.
Since GPT is a few-shot learner (see Language Models are Few-Shot Learners), we can significantly improve the model's performance by providing examples, instead of only providing context in a zero-shot approach.
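As an illustration, a few-shot prompt might be assembled like this (the example commands and the Q/A format here are invented for the sketch, not the project's actual prompt):

```python
# Hypothetical few-shot prompt builder: prepend worked examples so the
# model sees the desired "request -> shell command" mapping before the
# real query, instead of relying on zero-shot context alone.
FEW_SHOT_EXAMPLES = [
    ("list all files including hidden ones", "ls -la"),
    ("find the text TODO in all python files", "grep -rn 'TODO' --include='*.py' ."),
    ("show disk usage of the current directory", "du -sh ."),
]

def build_few_shot_prompt(user_request: str) -> str:
    parts = ["Translate natural language into a single shell command.\n"]
    for request, command in FEW_SHOT_EXAMPLES:
        parts.append(f"Q: {request}\nA: {command}\n")
    # The trailing "A:" invites the model to complete with a command.
    parts.append(f"Q: {user_request}\nA:")
    return "\n".join(parts)

print(build_few_shot_prompt("list files"))
```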
copilot -q "What is the best drink for late night coding?"
Traceback (most recent call last):
File "/home/eid/.local/bin/copilot", line 5, in <module>
from copilot.main import main
File "/home/eid/.local/lib/python3.8/site-packages/copilot/main.py", line 12, in <module>
from copilot.conversation import Conversation
File "/home/eid/.local/lib/python3.8/site-packages/copilot/conversation.py", line 20, in <module>
class Conversation:
File "/home/eid/.local/lib/python3.8/site-packages/copilot/conversation.py", line 21, in Conversation
messages: list[dict]
TypeError: 'type' object is not subscriptable
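For anyone hitting this: built-in generics like list[dict] only became subscriptable in annotations at runtime in Python 3.9 (PEP 585), so the class body crashes on 3.8. A sketch of the two usual fixes (the real Conversation class presumably has more fields than shown here):

```python
# Option 1: postpone annotation evaluation (works on Python 3.7+) by
# putting this at the very top of conversation.py:
#   from __future__ import annotations

# Option 2: use typing.List/typing.Dict, which subscript fine on 3.7/3.8
# where list[dict] raises TypeError: 'type' object is not subscriptable.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Conversation:
    # sketch of the annotation fix only, not the full original class
    messages: List[Dict] = field(default_factory=list)

conv = Conversation()
conv.messages.append({"role": "user", "content": "hello"})
```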
I tried the following commands:
conda create --name myenv python=3.7
conda activate myenv
The following command successfully installed the library
pip install terminal-copilot
With the following output message (showing only the final line)
Successfully installed charset-normalizer-2.1.1 et-xmlfile-1.1.0 idna-3.4 numpy-1.21.6 openai-0.25.0 openpyxl-3.0.10 pandas-1.3.5 pandas-stubs-1.2.0.62 pyperclip-1.8.2 python-dateutil-2.8.2 pytz-2022.7 requests-2.28.1 simple-term-menu-1.5.2 six-1.16.0 terminal-copilot-1.2.1 tqdm-4.64.1 types-pytz-2022.7.0.0 typing-extensions-4.4.0 urllib3-1.26.13
The following command raises an error
copilot list all files in the parent directory
The error
Traceback (most recent call last):
File "/Users/username/opt/anaconda3/envs/myenv/bin/copilot", line 8, in <module>
sys.exit(main())
File "/Users/username/opt/anaconda3/envs/myenv/lib/python3.7/site-packages/copilot/copilot.py", line 87, in main
"""
File "/Users/username/opt/anaconda3/envs/myenv/lib/python3.7/site-packages/copilot/history.py", line 46, in get_history
return _zsh_history(history_context_size)
File "/Users/username/opt/anaconda3/envs/myenv/lib/python3.7/site-packages/copilot/history.py", line 29, in _zsh_history
lines = history_file.zsh_history_file_lines()
File "/Users/username/opt/anaconda3/envs/myenv/lib/python3.7/site-packages/copilot/history_file.py", line 41, in zsh_history_file_lines
lines = history.read().splitlines()
File "/Users/username/opt/anaconda3/envs/myenv/lib/python3.7/codecs.py", line 322, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xa8 in position 1648: invalid start byte
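A possible mitigation, sketched with a hypothetical helper (not the project's actual history_file API): zsh "metafies" some bytes in its history file, so $HISTFILE is not guaranteed to be valid UTF-8, and a tolerant decode avoids the crash at the cost of a few replacement characters.

```python
from pathlib import Path

def read_history_lines(path: str) -> list:
    """Read a shell history file without assuming valid UTF-8.

    zsh escapes ("metafies") certain bytes in its history file, so strict
    UTF-8 decoding can raise UnicodeDecodeError; errors='replace'
    substitutes U+FFFD for undecodable bytes instead of crashing.
    """
    data = Path(path).read_bytes()
    return data.decode("utf-8", errors="replace").splitlines()
```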
[no error to see here :)]
In version 1.2.2, terminal-copilot sends the last 40 commands in your history to OpenAI, even if the --history flag is not present. This bug has been fixed in version 1.2.3; thank you to the authors for finding and solving it. I wonder if there is a way to notify people who downloaded the earlier version.
Thank you for this fantastic piece of software!
I appreciated seeing the thoughtful note in your README about sensitive information. I saw this line:
If you are concerned about the potential for sensitive information to be sent to OpenAI, we recommend not using these flags [i.e. --alias, --git, --history, etc.].
This made me think that no sensitive information would be transmitted to OpenAI if I avoided those flags. However, when I run terminal-copilot with the -v flag, I can see that it transmits a listing of the current directory as part of the prompt. You could clarify that this transmission happens even if no special flags are used. It might also be useful to let users disable it: another reason to disable this feature is that a very long directory listing can lead to an unexpected openai.error.InvalidRequestError.
Adding memory of prior copilot searches could make it a bit more like a conversation, where prior context persists to help find the correct command.
Traceback (most recent call last):
File "/usr/local/bin/copilot", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.10/site-packages/copilot/copilot.py", line 83, in main
{history.get_history() if args.history and operating_system.lower().startswith("lin") or operating_system.lower().startswith("dar") else ""}
File "/usr/local/lib/python3.10/site-packages/copilot/history.py", line 46, in get_history
return _zsh_history(history_context_size)
File "/usr/local/lib/python3.10/site-packages/copilot/history.py", line 29, in _zsh_history
lines = history_file.zsh_history_file_lines()
File "/usr/local/lib/python3.10/site-packages/copilot/history_file.py", line 41, in zsh_history_file_lines
lines = history.read().splitlines()
File "/usr/local/Cellar/[email protected]/3.10.8/Frameworks/Python.framework/Versions/3.10/lib/python3.10/codecs.py", line 322, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xbc in position 2703: invalid start byte
Allow the system to do multi-step reasoning to arrive at the correct command. For example, if the system needs to know what SSH keys you have or what's in a directory, it should first reason about which commands would get that information, and then compose those with subsequent commands.
For example, it failed on the command in the image below:
I don't have a key named id_rsa, which is the example key name used in the prompt when generating a key with ssh-keygen:
Generating public/private rsa key pair.
Enter file in which to save the key (/Users/justin/.ssh/id_rsa)
Ideally, it should first do ls /Users/justin/.ssh/, then find all keys with the word github in them, and also know to cat ~/.ssh/config for Host *github.com entries, etc.
I see that this package uses text-davinci-003 as its standard model: https://github.com/Methexis-Inc/terminal-copilot/blob/main/copilot/main.py#L186
but OpenAI recommends the new text-embedding-ada-002 for nearly all use cases, which is substantially cheaper and better: https://beta.openai.com/docs/guides/embeddings/embedding-models
Regardless, one of the code models (e.g., code-davinci-002) is pretrained specifically on code examples and might be better.
Are there any plans to switch, or is there a good reason we're using text-davinci-003?
To reproduce
> copilot split one list into two equally sized lists with a python oneliner
> explainshell: https://explainshell.com/explain?cmd=python%20-c%20%22print%28%5Bx%20for%20x%20in%20range%280%2C%20len%28input%28%29%29%2C%202%29%5D%29%22%20%7C%20xargs%20-n%202
Traceback (most recent call last):
File "/usr/local/bin/copilot", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.8/dist-packages/copilot/copilot.py", line 99, in main
subprocess.run(["open", "https://explainshell.com/explain?cmd=" + quote(cmd)])
File "/usr/lib/python3.8/subprocess.py", line 493, in run
with Popen(*popenargs, **kwargs) as process:
File "/usr/lib/python3.8/subprocess.py", line 858, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/usr/lib/python3.8/subprocess.py", line 1704, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'open'
I think open might be a macOS-specific command.
I tried linking openvt to open but got the following openvt error:
Couldn't get a file descriptor referring to the console
It does, however, work if I link xdg-open to open, or wslview to open, so I'm using that as a workaround for now.
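For what it's worth, Python's stdlib webbrowser module already does this per-platform dispatch (it uses open on macOS, xdg-open and friends on Linux, os.startfile on Windows, and honors the BROWSER environment variable, which could point at wslview on WSL). A sketch of how the call site might use it; the function names here are hypothetical, not the project's actual API:

```python
import webbrowser
from urllib.parse import quote

def explainshell_url(cmd: str) -> str:
    """Build the explainshell URL for a shell command."""
    return "https://explainshell.com/explain?cmd=" + quote(cmd)

def open_explainshell(cmd: str) -> bool:
    # webbrowser picks the right launcher for the platform, avoiding
    # the hard-coded ["open", ...] subprocess call that fails on Linux.
    return webbrowser.open(explainshell_url(cmd))
```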
I'm running into these errors on Debian 11
copilot show history
> history
execute
copy
explain shell
execute :
sh: 1: history: not found
copy :
> history
> copied
Traceback (most recent call last):
File "/home/[USER]/.local/bin/copilot", line 8, in <module>
sys.exit(main())
File "/home/[USER]/.local/lib/python3.9/site-packages/copilot/copilot.py", line 95, in main
subprocess.run(["pbcopy"], input=cmd, encoding="utf-8")
File "/usr/lib/python3.9/subprocess.py", line 505, in run
with Popen(*popenargs, **kwargs) as process:
File "/usr/lib/python3.9/subprocess.py", line 951, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/usr/lib/python3.9/subprocess.py", line 1823, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'pbcopy'
After reading the code, I understand you're asking ChatGPT for commands for zsh on macOS.
You should either detect the OS and shell being used, make it clear the environment is unsupported when running the command, or at least make it clear in the documentation/README that this is the case.
It currently fails on bash as well, because it can't find the "alias" command (which is a builtin in bash, not an executable).
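A portable alternative would be to probe for whichever clipboard tool actually exists instead of hard-coding pbcopy. This is only a sketch; pyperclip, which is already in the dependency list, performs a more thorough version of this dispatch:

```python
import shutil
import subprocess

def copy_to_clipboard(text: str) -> bool:
    """Copy text using the first clipboard tool found on PATH.

    Tries macOS (pbcopy), Wayland (wl-copy), then X11 (xclip) rather
    than assuming pbcopy exists, which crashes on Linux. Returns False
    if no known tool is installed.
    """
    candidates = (
        ["pbcopy"],
        ["wl-copy"],
        ["xclip", "-selection", "clipboard"],
    )
    for tool in candidates:
        if shutil.which(tool[0]):
            subprocess.run(tool, input=text, encoding="utf-8")
            return True
    return False
```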
Adding bash history to the prompt would almost certainly improve accuracy.
It would be cool to save all prompts and their responses (SQLite?). This would help control cost:
if I type the exact same command again, the result can be served from the database.
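A minimal sketch of such a cache (the table name, schema, and function names are invented for illustration): identical prompts hit the local database instead of the API.

```python
import sqlite3

def init_cache(path: str = ":memory:") -> sqlite3.Connection:
    """Open (or create) the prompt/response cache database."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cache (prompt TEXT PRIMARY KEY, response TEXT)"
    )
    return conn

def cached_completion(conn, prompt, fetch):
    """Return a cached response, calling `fetch(prompt)` only on a miss."""
    row = conn.execute(
        "SELECT response FROM cache WHERE prompt = ?", (prompt,)
    ).fetchone()
    if row:
        return row[0]          # cache hit: no API call, no cost
    response = fetch(prompt)   # cache miss: call the model once
    conn.execute("INSERT INTO cache VALUES (?, ?)", (prompt, response))
    conn.commit()
    return response
```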
Why does the prompt need env vars to come up with the right command?
Concern: Credentials are often stored in env variables.
Recommendation: I'd recommend against including env vars unless absolutely necessary.
The program does not run with a non-UTF-8 locale, crashing with the error message:
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe8 in position 3444: invalid continuation byte
It does not seem to be related to the terminal emulator I'm using: reproduced with both konsole and aterm.
Here is the complete output of my attempt to install and try out terminal-copilot
on Gentoo Linux:
kirx@eridan ~ $ pip install terminal-copilot --user
Collecting terminal-copilot
Downloading terminal_copilot-1.0.7-py3-none-any.whl (8.2 kB)
Requirement already satisfied: charset-normalizer==2.1.1 in /usr/lib64/python3.8/site-packages (from terminal-copilot) (2.1.1)
Collecting simple-term-menu==1.5.2
Downloading simple_term_menu-1.5.2-py3-none-any.whl (26 kB)
Collecting typing-extensions==4.4.0
Downloading typing_extensions-4.4.0-py3-none-any.whl (26 kB)
Requirement already satisfied: idna==3.4 in /usr/lib64/python3.8/site-packages (from terminal-copilot) (3.4)
Collecting numpy==1.23.5
Downloading numpy-1.23.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.1 MB)
---------------------------------------- 17.1/17.1 MB 2.4 MB/s eta 0:00:00
Collecting openai==0.25.0
Downloading openai-0.25.0.tar.gz (44 kB)
--------------------------------------- 44.9/44.9 kB 825.8 kB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Preparing metadata (pyproject.toml) ... done
Collecting et-xmlfile==1.1.0
Downloading et_xmlfile-1.1.0-py3-none-any.whl (4.7 kB)
Requirement already satisfied: python-dateutil==2.8.2 in /usr/lib64/python3.8/site-packages (from terminal-copilot) (2.8.2)
Collecting pytz==2022.6
Downloading pytz-2022.6-py2.py3-none-any.whl (498 kB)
------------------------------------- 498.1/498.1 kB 622.6 kB/s eta 0:00:00
Requirement already satisfied: requests==2.28.1 in /usr/lib64/python3.8/site-packages (from terminal-copilot) (2.28.1)
Collecting certifi==2022.12.7
Downloading certifi-2022.12.7-py3-none-any.whl (155 kB)
--------------------------------------- 155.3/155.3 kB 1.7 MB/s eta 0:00:00
Collecting types-pytz==2022.6.0.1
Downloading types_pytz-2022.6.0.1-py3-none-any.whl (4.7 kB)
Collecting pandas-stubs==1.5.2.221124
Downloading pandas_stubs-1.5.2.221124-py3-none-any.whl (146 kB)
--------------------------------------- 146.4/146.4 kB 1.8 MB/s eta 0:00:00
Requirement already satisfied: six==1.16.0 in /usr/lib64/python3.8/site-packages (from terminal-copilot) (1.16.0)
Collecting openpyxl==3.0.10
Downloading openpyxl-3.0.10-py2.py3-none-any.whl (242 kB)
------------------------------------- 242.1/242.1 kB 895.9 kB/s eta 0:00:00
Collecting pandas==1.5.2
Downloading pandas-1.5.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (12.2 MB)
---------------------------------------- 12.2/12.2 MB 1.7 MB/s eta 0:00:00
Collecting tqdm==4.64.1
Downloading tqdm-4.64.1-py2.py3-none-any.whl (78 kB)
---------------------------------------- 78.5/78.5 kB 1.0 MB/s eta 0:00:00
Collecting urllib3==1.26.13
Downloading urllib3-1.26.13-py2.py3-none-any.whl (140 kB)
--------------------------------------- 140.6/140.6 kB 1.1 MB/s eta 0:00:00
Building wheels for collected packages: openai
Building wheel for openai (pyproject.toml) ... done
Created wheel for openai: filename=openai-0.25.0-py3-none-any.whl size=55859 sha256=cb542f4654240372aefa0c06bf92b541a89d6a3c7a4e9446fc528ff7593fafb6
Stored in directory: /home/kirx/.cache/pip/wheels/9b/ec/6b/a7a72f2bcb08749a0b4be510b397fda58c364b7216fc114e5e
Successfully built openai
Installing collected packages: types-pytz, pytz, urllib3, typing-extensions, tqdm, simple-term-menu, pandas-stubs, numpy, et-xmlfile, certifi, pandas, openpyxl, openai, terminal-copilot
Attempting uninstall: tqdm
Found existing installation: tqdm 4.60.0
Uninstalling tqdm-4.60.0:
Successfully uninstalled tqdm-4.60.0
Successfully installed certifi-2022.12.7 et-xmlfile-1.1.0 numpy-1.23.5 openai-0.25.0 openpyxl-3.0.10 pandas-1.5.2 pandas-stubs-1.5.2.221124 pytz-2022.6 simple-term-menu-1.5.2 terminal-copilot-1.0.7 tqdm-4.64.1 types-pytz-2022.6.0.1 typing-extensions-4.4.0 urllib3-1.26.13
kirx@eridan ~ $ copilot list all files
Traceback (most recent call last):
File "/home/kirx/.local/bin/copilot", line 8, in <module>
sys.exit(main())
File "/home/kirx/.local/lib/python3.8/site-packages/copilot/copilot.py", line 48, in main
{subprocess.run(["ls"], capture_output=True).stdout.decode("utf-8")}
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe8 in position 4671: invalid continuation byte
kirx@eridan ~ $ export LANG=C
kirx@eridan ~ $ copilot list all files
Traceback (most recent call last):
File "/home/kirx/.local/bin/copilot", line 8, in <module>
sys.exit(main())
File "/home/kirx/.local/lib/python3.8/site-packages/copilot/copilot.py", line 48, in main
{subprocess.run(["ls"], capture_output=True).stdout.decode("utf-8")}
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe8 in position 3444: invalid continuation byte
kirx@eridan ~ $ copilot how to get local time
Traceback (most recent call last):
File "/home/kirx/.local/bin/copilot", line 8, in <module>
sys.exit(main())
File "/home/kirx/.local/lib/python3.8/site-packages/copilot/copilot.py", line 48, in main
{subprocess.run(["ls"], capture_output=True).stdout.decode("utf-8")}
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe8 in position 3444: invalid continuation byte
kirx@eridan ~ $ locale
LANG=C
LC_CTYPE="C"
LC_NUMERIC="C"
LC_TIME="C"
LC_COLLATE="C"
LC_MONETARY="C"
LC_MESSAGES="C"
LC_PAPER="C"
LC_NAME="C"
LC_ADDRESS="C"
LC_TELEPHONE="C"
LC_MEASUREMENT="C"
LC_IDENTIFICATION="C"
LC_ALL=
kirx@eridan ~ $ uname -a
Linux eridan 5.11.8-gentoo-r1-x86_64 #1 SMP Sun Mar 28 03:47:57 UTC 2021 x86_64 Intel(R) Core(TM) i7-4702MQ CPU @ 2.20GHz GenuineIntel GNU/Linux
kirx@eridan ~ $ env
SHELL=/bin/bash
WINDOWID=37748738
COLORTERM=rxvt
XDG_CONFIG_DIRS=/etc/xdg
LESS=-R -M --shift 5
JDK_HOME=/etc/java-config-2/current-system-vm
CONFIG_PROTECT_MASK=/etc/sandbox.d /etc/php/cli-php7.4/ext-active/ /etc/php/cgi-php7.4/ext-active/ /etc/php/apache2-php7.4/ext-active/ /etc/php/cli-php8.0/ext-active/ /etc/php/cgi-php8.0/ext-active/ /etc/php/apache2-php8.0/ext-active/ /etc/php/cli-php8.1/ext-active/ /etc/php/cgi-php8.1/ext-active/ /etc/php/apache2-php8.1/ext-active/ /etc/php/fpm-php8.1/ext-active/ /etc/php/phpdbg-php8.1/ext-active/ /etc/fonts/fonts.conf /etc/gentoo-release /etc/terminfo /etc/dconf /etc/ca-certificates.conf /etc/texmf/web2c /etc/texmf/language.dat.d /etc/texmf/language.def.d /etc/texmf/updmap.d /etc/revdep-rebuild
HISTSIZE=
PGPLOT_DIR=/usr/lib64/pgplot/
JAVA_HOME=/etc/java-config-2/current-system-vm
iraf=/mnt/usb/iraf/iraf/
CALDBCONFIG=/mnt/usb/caldb//software/tools/caldb.config
HISTTIMEFORMAT=[%F %T]
ANT_HOME=/usr/share/ant
ADS_API_TOKEN=lVivFKGO6bcxurjfKeN25SFswFtg7EOuT0vggcQp
EDITOR=/usr/bin/joe
PWD=/home/kirx
CONFIG_PROTECT=/usr/share/gnupg/qualified.txt /usr/share/config /usr/lib64/libreoffice/program/sofficerc
LOGNAME=kirx
PGPLOT_FONT=/usr/lib64/pgplot//grfont.dat
MANPATH=/etc/java-config-2/current-system-vm/man:/usr/share/gcc-data/x86_64-pc-linux-gnu/11.2.1/man:/usr/share/binutils-data/x86_64-pc-linux-gnu/2.38/man:/etc/java-config-2/current-system-vm/man/:/usr/lib64/php7.4/man/:/usr/lib64/php8.0/man/:/usr/lib64/php8.1/man/:/usr/local/share/man:/usr/share/man:/usr/lib/rust/man:/usr/lib/llvm/15/share/man:/usr/lib/llvm/14/share/man:/usr/lib/llvm/13/share/man
IRAFARCH=
XAUTHORITY=/home/kirx/.Xauthority
OPENCL_PROFILE=beignet
SCHED=/home/kirx/sched
WINDOWPATH=7
HOME=/home/kirx
LANG=C
HISTFILE=/home/kirx/.bash_eternal_history
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=01;05;37;41:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.cfg=00;32:*.conf=00;32:*.diff=00;32:*.doc=00;32:*.ini=00;32:*.log=00;32:*.patch=00;32:*.pdf=00;32:*.ps=00;32:*.tex=00;32:*.txt=00;32:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:
FLTK_DOCDIR=/usr/share/doc/fltk-1.3.5-r4/html
DIFMAP_LOGIN=/home/kirx/.difmap
OPENGL_PROFILE=xorg-x11
QT_GRAPHICSSYSTEM=raster
HEADAS=/mnt/usb/heasoft-6.30.1/x86_64-pc-linux-gnu-libc2.32
CSPICE_PLANETDATA_FILE=/home/kirx/.vartools/pck00010.tpc
INFOPATH=/usr/share/gcc-data/x86_64-pc-linux-gnu/11.2.1/info:/usr/share/binutils-data/x86_64-pc-linux-gnu/2.38/info:/usr/share/info:/usr/share/info/emacs-27
MOZ_GMP_PATH=/usr/lib64/nsbrowser/plugins/gmp-gmpopenh264/system-installed
JAVAC=/etc/java-config-2/current-system-vm/bin/javac
XEHELPURL=/usr/share/doc/xephem-4.1.0/html/xephem.html
TERMINFO=/usr/share/terminfo
TERM=rxvt
LESSOPEN=|lesspipe %s
USER=kirx
CSPICE_EPHEM_FILE=/home/kirx/.vartools/de432s.bsp
COLORFGBG=0;15
MANPAGER=manpager
DISPLAY=:0.0
CALDB=/mnt/usb/caldb/
SHLVL=3
PAGER=/usr/bin/less
LD_LIBRARY_PATH=/usr/local/lib::.
CALDBALIAS=/mnt/usb/caldb//software/tools/alias_config.fits
GCC_SPECS=
GSETTINGS_BACKEND=dconf
XDG_DATA_DIRS=/usr/local/share:/usr/share
PATH=/home/kirx/mybin:/home/kirx/.local/bin:/home/kirx/current_work/vast/util/ccd:/home/kirx/cod/wcstools-3.9.4/bin:/opt/sun-jre-bin-1.6.0.02/bin/:/home/kirx/.iraf/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/opt/bin:/usr/lib/llvm/15/bin:/usr/lib/llvm/14/bin:/usr/lib/llvm/13/bin:/usr/bin/cdsclient:/home/kirx/.gem/ruby/1.8/bin/:/usr/games/bin/
VBOX_APP_HOME=/usr/lib64/virtualbox
CSPICE_LEAPSEC_FILE=/home/kirx/.vartools/naif0011.tls
HISTFILESIZE=
OLDPWD=/home/kirx/video
_=/usr/bin/env
Previously, I had issues with various Python software mysteriously crashing after seeing my non-UTF-8 locale, but all the previous problems could be fixed by exporting LANG=C.
Worked great the other day, now I'm getting this, no matter the command:
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 6365 tokens (6109 in your prompt; 256 for the completion). Please reduce your prompt; or completion length.
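One way to avoid this class of error would be to drop optional context sections until a rough size estimate fits the model window. A sketch: the 4-characters-per-token ratio is a crude heuristic, not a real tokenizer (a library such as tiktoken would count accurately), and the function name is invented.

```python
def trim_to_budget(sections, max_tokens=3500, chars_per_token=4):
    """Drop optional context sections (last first) until a rough token
    estimate fits the model's context window.

    `sections` is ordered by priority: the first entry (e.g. the user's
    request) is never dropped; trailing entries (directory listing,
    history, env) are discarded as needed.
    """
    sections = list(sections)

    def estimate(parts):
        # crude heuristic: ~4 characters per token for English text
        return sum(len(p) for p in parts) // chars_per_token

    while len(sections) > 1 and estimate(sections) > max_tokens:
        sections.pop()  # discard the lowest-priority section
    return "\n".join(sections)
```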
Steps to reproduce:
git clone https://github.com/opensouls/terminal-copilot.git
cd terminal-copilot
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
python3 setup.py install
Running copilot then produces:
Traceback (most recent call last):
File "/home/deftera/terminal-copilot/.venv/bin/copilot", line 33, in <module>
sys.exit(load_entry_point('terminal-copilot==1.2.2', 'console_scripts', 'copilot')())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/deftera/terminal-copilot/.venv/bin/copilot", line 25, in importlib_load_entry_point
return next(matches).load()
^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/importlib/metadata/__init__.py", line 202, in load
module = import_module(match.group('module'))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/importlib/__init__.py", line 126, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 936, in exec_module
File "<frozen importlib._bootstrap_external>", line 1074, in get_code
File "<frozen importlib._bootstrap_external>", line 1004, in source_to_code
File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
File "/home/deftera/terminal-copilot/.venv/lib/python3.11/site-packages/terminal_copilot-1.2.2-py3.11.egg/copilot/main.py", line 246
"""
^
SyntaxError: unterminated triple-quoted string literal (detected at line 252)
The problem seems to be where prompt += """ is added; removing that line fixes it. The thing is, I don't know what was actually intended: if a literal quote was meant, it can be escaped, or the string can be surrounded with single quotes if triple quotes were intended.
openai.error.InvalidRequestError: The model text-davinci-003 has been deprecated, learn more here: https://platform.openai.com/docs/deprecations
One of the commands I use most often in the terminal is reverse search (CTRL+R), but you have to remember the exact first letters of an old command to use it.
What if copilot had a built-in reverse search that was more like fuzzy matching?
It would look at your command history and pick the most semantically similar entry (it could probably do an embedding search, or just use a prompt to select one).
I imagine it working like:
copilot -rs that command I used in the past but forgot exactly how it was typed
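As a rough sketch of the idea: stdlib difflib already gives fuzzy (though not semantic) matching over history, whereas an embedding search would rank by meaning. Function name and parameters are invented for illustration.

```python
import difflib

def fuzzy_history_search(query, history, n=3):
    """Rank past commands by rough string similarity to the query.

    A stand-in for the proposed feature: difflib scores by character
    overlap, so it finds near-matches without exact prefixes, but it
    won't find semantically similar commands the way embeddings would.
    """
    return difflib.get_close_matches(query, history, n=n, cutoff=0.1)
```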