
circlestarzero / ex-chatgpt


Let ChatGPT truly learn how to go online and call APIs! 'EX-ChatGPT' can rival and even surpass NewBing

License: MIT License

TypeScript 34.21% Python 35.63% JavaScript 9.72% HTML 11.39% CSS 6.99% Dockerfile 0.83% Shell 1.23%

ex-chatgpt's Issues

Requests always return EOF occurred in violation of protocol (_ssl.c:992)

  • aarch64 GNU/Linux architecture
  • A proxy is configured; curl www.google.com works from inside the container
  • urllib3==1.25.11 inside the container
  • The error log is as follows:
192.168.1.109 - - [18/Mar/2023 18:30:45] "GET /api/APIProcess HTTP/1.1" 200 -
Debugging middleware caught exception in streamed response at a point where response headers were already sent.
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 670, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 381, in _make_request
    self._validate_conn(conn)
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 978, in _validate_conn
    conn.connect()
  File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 362, in connect
    self.sock = ssl_wrap_socket(
                ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/util/ssl_.py", line 386, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/ssl.py", line 517, in wrap_socket
    return self.sslsocket_class._create(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/ssl.py", line 1075, in _create
    self.do_handshake()
  File "/usr/local/lib/python3.11/ssl.py", line 1346, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:992)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 726, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/util/retry.py", line 446, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:992)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/chatGPTEx/search.py", line 93, in web
    resp = directQuery(f'Query: {query}', conv_id=conv_id)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/chatGPTEx/search.py", line 153, in directQuery
    response = chatbot.ask(prompt+'\n'+query,convo_id=conv_id)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/chatGPTEx/optimizeOpenAI.py", line 206, in ask
    full_response: str = "".join(response)
                         ^^^^^^^^^^^^^^^^^
  File "/app/chatGPTEx/optimizeOpenAI.py", line 160, in ask_stream
    response = self.session.post(
               ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 590, in post
    return self.request('POST', url, data=data, json=json, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 514, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:992)')))

192.168.1.109 - - [18/Mar/2023 18:30:45] "GET /api/query?mode=web&uuid=default&prompt=&msg=hi HTTP/1.1" 200 -

What could the problem be?
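An SSLEOFError like this means the TLS handshake to api.openai.com was cut off mid-way, which in a containerized setup with a proxy usually comes down to the Python process not actually using the proxy that curl uses. As a quick, hedged check (the session in optimizeOpenAI.py is assumed here to be a plain requests.Session, and the proxy address is a placeholder), you can force a request through the proxy explicitly:

    import requests

    # Hypothetical connectivity check: route the request through the same proxy
    # that curl uses inside the container. Replace the address with your real proxy.
    proxies = {
        "http": "http://127.0.0.1:7890",
        "https": "http://127.0.0.1:7890",
    }

    session = requests.Session()
    session.proxies.update(proxies)  # or export HTTP_PROXY/HTTPS_PROXY inside the container

    resp = session.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": "Bearer sk-..."},
        json={"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "ping"}]},
        timeout=30,
    )
    print(resp.status_code, resp.text[:200])

If this direct call succeeds while the app still fails, the proxy settings simply are not reaching the app's session (for example, set on the host but not inside the container).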

Delete Conversation

A question has come up: how do I delete the current conversation and start a new chat?
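The tracebacks elsewhere in this issue list show the bot keeping per-conversation history in a chatbot.conversation dict keyed by convo_id, so one hedged way to start over is to drop that key and recreate it. This is a sketch built on that assumption, not the project's confirmed API; check optimizeOpenAI.py for an official reset method first:

    # Hypothetical sketch: reset a single conversation.
    # `chatbot.conversation` being a dict keyed by convo_id is inferred from the
    # tracebacks in this issue list; adapt to the real optimizeOpenAI.py interface.
    def reset_conversation(chatbot, convo_id: str = "default") -> None:
        chatbot.conversation.pop(convo_id, None)  # forget the old history
        chatbot.conversation[convo_id] = []       # start a fresh, empty history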

pip install error

I asked GPT-4 about it:

It looks like you are trying to install the deprecated azure-storage package, which is no longer supported. Instead, you should install the appropriate packages for your needs. Based on the error message, some common alternatives are:

azure-storage-blob: Blob storage client
azure-storage-file-share: Storage file share client
azure-storage-file-datalake: ADLS Gen2 client
azure-storage-queue: Queue storage client

First, choose the package that fits your needs. Then replace azure_storage==0.37.0 in requirements.txt with the chosen package. For example, if you need the Blob storage client, replace it with azure-storage-blob.

Once that is done, rerun pip3 install -r requirements.txt to install the new dependencies.

ai\EX-chatGPT> pip3 install -r requirements.txt
Collecting azure_storage==0.37.0
Downloading azure-storage-0.37.0.zip (4.3 kB)
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
Traceback (most recent call last):
  File "<string>", line 2, in <module>
  File "<pip-setuptools-caller>", line 34, in <module>
  File "C:\Users\myy\AppData\Local\Temp\pip-install-b7dcmfhu\azure-storage_f3ad491fda9f4dff88388d9fe926a6f4\setup.py", line 55, in <module>
    raise RuntimeError(message)
RuntimeError:

  Starting with v0.37.0, the 'azure-storage' meta-package is deprecated and cannot be installed anymore.
  Please install the service specific packages prefixed by `azure` needed for your application.

  The complete list of available packages can be found at:
  https://aka.ms/azsdk/python/all

  Here's a non-exhaustive list of common packages:

  - [azure-storage-blob](https://pypi.org/project/azure-storage-blob) : Blob storage client
  - [azure-storage-file-share](https://pypi.org/project/azure-storage-file-share) : Storage file share client
  - [azure-storage-file-datalake](https://pypi.org/project/azure-storage-file-datalake) : ADLS Gen2 client
  - [azure-storage-queue](https://pypi.org/project/azure-storage-queue): Queue storage client

  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details
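Following the advice above, the concrete change is a one-line edit to requirements.txt (azure-storage-blob is used here only as an example; pick whichever service client the code actually needs):

    # requirements.txt (excerpt) -- replace the deprecated meta-package
    # azure_storage==0.37.0
    azure-storage-blob

Then rerun pip3 install -r requirements.txt.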

Context memory

A question: the API-calling feature works, but the bot seems to have lost its conversational memory and turned into single-turn Q&A?
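For what it's worth, a traceback in a later issue ("SSL error") shows search.py already prepending the stored history to the query, which is the usual fix; if your copy predates that change, the idea looks roughly like the sketch below (names are taken from those tracebacks and are not a confirmed API):

    # Hypothetical sketch, modeled on the call visible in search.py's traceback:
    # include the stored history for this conversation when building the web query.
    history = chatbot.conversation.get(conv_id, [])
    resp = directQuery(f"Chat History info: {history}\nQuery: {query}", conv_id=conv_id)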

Could the author please publish a docker image? I can't reproduce this on my server.

The error is:
Building wheel for tiktoken (pyproject.toml): started
#8 14.32 Building wheel for tiktoken (pyproject.toml): finished with status 'error'
#8 14.32 error: subprocess-exited-with-error
#8 14.32
#8 14.32 × Building wheel for tiktoken (pyproject.toml) did not run successfully.
#8 14.32 │ exit code: 1
#8 14.32 ╰─> [37 lines of output]
#8 14.32 running bdist_wheel
#8 14.32 running build
#8 14.32 running build_py
#8 14.32 creating build
#8 14.32 creating build/lib.linux-aarch64-cpython-311
#8 14.32 creating build/lib.linux-aarch64-cpython-311/tiktoken
#8 14.32 copying tiktoken/core.py -> build/lib.linux-aarch64-cpython-311/tiktoken
#8 14.32 copying tiktoken/__init__.py -> build/lib.linux-aarch64-cpython-311/tiktoken
#8 14.32 copying tiktoken/model.py -> build/lib.linux-aarch64-cpython-311/tiktoken
#8 14.32 copying tiktoken/registry.py -> build/lib.linux-aarch64-cpython-311/tiktoken
#8 14.32 copying tiktoken/load.py -> build/lib.linux-aarch64-cpython-311/tiktoken
#8 14.32 creating build/lib.linux-aarch64-cpython-311/tiktoken_ext
#8 14.32 copying tiktoken_ext/openai_public.py -> build/lib.linux-aarch64-cpython-311/tiktoken_ext
#8 14.32 running egg_info
#8 14.32 writing tiktoken.egg-info/PKG-INFO
#8 14.32 writing dependency_links to tiktoken.egg-info/dependency_links.txt
#8 14.32 writing requirements to tiktoken.egg-info/requires.txt
#8 14.32 writing top-level names to tiktoken.egg-info/top_level.txt
#8 14.32 reading manifest file 'tiktoken.egg-info/SOURCES.txt'
#8 14.32 reading manifest template 'MANIFEST.in'
#8 14.32 warning: no files found matching 'Makefile'
#8 14.32 adding license file 'LICENSE'
#8 14.32 writing manifest file 'tiktoken.egg-info/SOURCES.txt'
#8 14.32 copying tiktoken/py.typed -> build/lib.linux-aarch64-cpython-311/tiktoken
#8 14.32 running build_ext
#8 14.32 running build_rust
#8 14.32 error: can't find Rust compiler
#8 14.32
#8 14.32 If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler.
#8 14.32
#8 14.32 To update pip, run:
#8 14.32
#8 14.32 pip install --upgrade pip
#8 14.32
#8 14.32 and then retry package installation.
#8 14.32
#8 14.32 If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain.
#8 14.32 [end of output]
#8 14.32
#8 14.32 note: This error originates from a subprocess, and is likely not a problem with pip.
#8 14.32 ERROR: Failed building wheel for tiktoken
#8 14.32 Building wheel for jieba (setup.py): started
#8 15.79 Building wheel for jieba (setup.py): finished with status 'done'
#8 15.80 Created wheel for jieba: filename=jieba-0.42.1-py3-none-any.whl size=19314458 sha256=ae26fd1eb2cce103bb6db861d567811f9ad88e5243f27ae349150607c9f8183c
#8 15.80 Stored in directory: /tmp/pip-ephem-wheel-cache-wrid_ysw/wheels/9e/e1/e6/a79f6806b13624bde86455ca13b7ea62f5ab0cd789ba1e8535
#8 15.80 Successfully built regex jieba
#8 15.80 Failed to build tiktoken
#8 15.80 ERROR: Could not build wheels for tiktoken, which is required to install pyproject.toml-based projects
#8 15.88
#8 15.88 [notice] A new release of pip available: 22.3.1 -> 23.0.1
#8 15.88 [notice] To update, run: pip install --upgrade pip
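The decisive line in that log is "error: can't find Rust compiler": on linux/aarch64 pip falls back to building tiktoken from source, which needs a Rust toolchain. A hedged sketch of the kind of lines one could add to the Dockerfile before the pip install step (the base image and surrounding steps here are assumptions, not the repo's actual Dockerfile):

    # Hypothetical Dockerfile excerpt for a Debian-based Python image.
    # Upgrade pip first (a newer pip may find a prebuilt tiktoken wheel),
    # then install a Rust toolchain so a source build works on aarch64.
    RUN pip install --upgrade pip && \
        apt-get update && apt-get install -y --no-install-recommends curl build-essential && \
        curl -sSf https://sh.rustup.rs | sh -s -- -y
    ENV PATH="/root/.cargo/bin:${PATH}"
    RUN pip install -r requirements.txt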

So it isn't built on the langchain package?

I've been studying langchain and gpt_index for a while. At first glance I assumed this project was a web wrapper around langchain, but on closer inspection it isn't. I wonder whether the author is familiar with that library; combining the two could avoid reinventing the wheel. Or has the author already drawn on its design and simply reworked the prompts?

That said, this project really is impressive. Ever since I saw langchain and New Bing I've wanted to build a similar tool for my own use but never found the time; with this project it looks like I can use it directly, kudos to the author. It might also be worth integrating long-document and PDF reading, simply by querying through gpt_index.

Suggestions for the project and API calls

The author of this issue would like to contribute code to the repository; this issue only serves to confirm that the features to be added are ones the maintainer actually wants. It is not recommended to fork the main repository and start implementing right after seeing this issue.

API

  • Support the SearXNG API (it covers many search engines such as Google, Bing, DDG, Wikipedia, Wikidata, StackOverflow, etc., and needs no API keys); see the sketch after this list
  • Pre-translate with a translation API, or let (Chat)GPT translate first (similar to New Bing; there is far more English content on the web, and the Simplified-Chinese web contains more incorrect content) (completion status unclear)
  • Support the DALL-E / Codex APIs (ChatGPT can already display images via Markdown syntax, so DALL-E could be hooked in; Codex may be more specialized for code)
  • Consider having ChatGPT's answers not emphasize that the information is "from the API" (completion status unclear)
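As a rough illustration of the first point, a SearXNG instance with the JSON output format enabled can be queried with a plain GET request. A minimal sketch (the instance URL is a placeholder, and format=json must be allowed in the instance's settings):

    import requests

    # Hypothetical sketch: query a self-hosted SearXNG instance.
    # "https://searx.example.com" is a placeholder; the instance must have the
    # json format enabled in its settings for this to return JSON.
    def searxng_search(query: str, instance: str = "https://searx.example.com"):
        resp = requests.get(
            f"{instance}/search",
            params={"q": query, "format": "json", "language": "en"},
            timeout=15,
        )
        resp.raise_for_status()
        return [
            {"title": r.get("title"), "url": r.get("url"), "content": r.get("content")}
            for r in resp.json().get("results", [])
        ]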

Project

  • Consider automated builds with GitHub Actions
  • Consider adding unit tests
  • Naming consistency (Chat/chat, Ex-chatGPT/chatGPTEx, the "Ehance" typo)
  • Documentation

Static assets unreachable in a docker deployment

  • I have checked the documentation and the existing issues; no similar problem was found

Deployment

  • CentOS, deployed with docker, with nginx as a reverse proxy so the site is reachable via a domain name

Problem

After opening the site, the page loads but there is no UI.
[screenshot]
Trying to chat, nothing happens when the button is clicked. The console shows that the static assets and TS scripts failed to load, returning 404.
[screenshot]

Could someone advise how to configure things so these static assets become accessible?

How do I deploy this project?

I'm very interested in this project but have no relevant experience at all. Could the author reply to me? I'd be happy to sponsor.

Chrome extension

Could you package your modified Chrome extension? It would be more convenient; building it myself is quite a hassle, thanks.
I also hope the README can be fleshed out; it's still unclear how to use the project, thanks.

About citations in answers

I hope the final answer can include New Bing-style citations. I tested this directly with ChatGPT and it looks feasible.
[screenshot]

Also, instead of always saying "according to the results from the API", it would be better to state which website or service the information came from and then give the citation.

SSL error

Environment: WSL2, Ubuntu 22.04 LTS
The program is launched through proxychains over a SOCKS5 tunnel
Traceback (most recent call last):
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2091, in call
return self.wsgi_app(environ, start_response)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2076, in wsgi_app
response = self.handle_exception(e)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2073, in wsgi_app
response = self.full_dispatch_request()
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1518, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1516, in full_dispatch_request
rv = self.dispatch_request()
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1502, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/home/collins/EX-chatGPT/chatGPTEx/main.py", line 51, in get_bot_response
res = parse_text(web(q,conv_id=uuid))
File "/home/collins/EX-chatGPT/chatGPTEx/search.py", line 58, in web
resp = directQuery(f'Chat History info: {chatbot.conversation[conv_id]}\n Query: {query}', conv_id= conv_id)
File "/home/collins/EX-chatGPT/chatGPTEx/search.py", line 116, in directQuery
response = chatbot.ask(prompt+'\n'+query,convo_id=conv_id)
File "/home/collins/EX-chatGPT/chatGPTEx/optimizeOpenAI.py", line 219, in ask
full_response: str = "".join(response)
File "/home/collins/EX-chatGPT/chatGPTEx/optimizeOpenAI.py", line 165, in ask_stream
response = self.session.post(
File "/home/collins/.local/lib/python3.10/site-packages/requests/sessions.py", line 590, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/home/collins/.local/lib/python3.10/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/home/collins/.local/lib/python3.10/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/home/collins/.local/lib/python3.10/site-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:997)')))

apikey.ini
[Google]
GOOGLE_API_KEY = ***
SEARCH_ENGINE_ID = ***
[OpenAI]
key0 =***
[WolframAlpha]
WOLFRAMALPHA_APP_ID =***
[Proxy]
api_proxy =https://api.openai.com/v1/chat/completions

ChatGPT's answer:
This error is usually caused by the server not responding or the network connection being interrupted. If it occurs when using curl to access an API, try the following steps:

Make sure the target API service is running; try opening the API address in a browser to see whether it is reachable.
Check your network connection and make sure your machine can reach the target API server. You can test connectivity with ping, e.g.: ping api.example.com.
If the target API uses HTTPS, make sure your machine has the appropriate SSL certificates, otherwise a secure connection may not be established; you can try curl's --insecure option to disable certificate verification, e.g.: curl --insecure https://api.example.com.
Try accessing the target API with another tool or library (such as Postman or requests) to see whether the problem is specific to curl. If other tools also fail, the problem is likely with the API service itself.
If none of the above helps, contact the API provider for further support.

linux shell command:
proxychains curl chat.openai.com
ProxyChains-3.1 (http://proxychains.sf.net)
|S-chain|-<>-172.18.144.1:58225-<><>-40.90.4.2:80-<><>-OK
curl: (52) Empty reply from server

Clash log:
[screenshot]

I'd like to know how the "Google search engine id" is billed

How is the "Google search engine id" billed? Also, when I asked ChatGPT about it, it said: "When using the Google Custom Search Engine API, you can make 100 requests per day (i.e. 100 searches per day); beyond that limit you have to pay." Is that true?

search.py throws an error

search.py throws an error
[screenshot]

The browser UI shows only the token count
[screenshot]

Rolling back to the version from the 9th (or earlier) works fine.

Multi-round search

Sometimes the bot may need one round of search just to work out which API to call next before it can complete the request. Does the current bot support multi-round search?
Also, could the bot be given the ability to fetch web pages directly for material? For example, if the first round of search finds that URL1 looks helpful, the second round could have the bot request URL1 directly and read its content. (See the sketch below.)
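As a rough illustration of the second idea, fetching a page and reducing it to plain text before handing it to the model could look like the sketch below (standard-library HTMLParser only; a real implementation would also need error handling, a length cap that fits the token budget, and some politeness around timeouts):

    import requests
    from html.parser import HTMLParser

    class _TextExtractor(HTMLParser):
        """Collect visible text, skipping script and style blocks."""
        def __init__(self):
            super().__init__()
            self._skip = False
            self.chunks = []
        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self._skip = True
        def handle_endtag(self, tag):
            if tag in ("script", "style"):
                self._skip = False
        def handle_data(self, data):
            if not self._skip and data.strip():
                self.chunks.append(data.strip())

    def fetch_page_text(url: str, max_chars: int = 4000) -> str:
        # Hypothetical helper: download a URL the first search round surfaced
        # and return its visible text, truncated to fit the prompt.
        resp = requests.get(url, timeout=15)
        resp.raise_for_status()
        parser = _TextExtractor()
        parser.feed(resp.text)
        return " ".join(parser.chunks)[:max_chars]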

No answer is returned

Traceback (most recent call last):
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 2091, in call
return self.wsgi_app(environ, start_response)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 2076, in wsgi_app
response = self.handle_exception(e)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 2073, in wsgi_app
response = self.full_dispatch_request()
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 1518, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 1516, in full_dispatch_request
rv = self.dispatch_request()
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 1502, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/main.py", line 34, in get_bot_response
res = parse_text(web(q,conv_id=uuid))
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/search.py", line 76, in web
apir = APIQuery(query,resp=resp)
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/search.py", line 151, in APIQuery
response = chatbot.ask(prompt,convo_id='api')
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/optimizeOpenAI.py", line 206, in ask
full_response: str = "".join(response)
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/optimizeOpenAI.py", line 154, in ask_stream
response = self.session.post(
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/requests/sessions.py", line 590, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1129)')))

Request for support of Bing Search API

A number of local developers may have greater familiarity with Azure services, and some users may prefer Bing search results due to its tendency to provide direct answers to queries as compared to Google. Hence, would it be possible for you to add support for the Bing Search API?

Thank you for your time and consideration!
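For reference, the Bing Web Search API (Azure Cognitive Services) is a single GET with a subscription-key header, so an adapter would be small. A hedged sketch (the endpoint and response fields follow the v7 API as I understand it; verify against the current Azure documentation):

    import requests

    def bing_search(query: str, api_key: str, count: int = 5):
        # Hypothetical adapter for the Bing Web Search v7 API.
        resp = requests.get(
            "https://api.bing.microsoft.com/v7.0/search",
            headers={"Ocp-Apim-Subscription-Key": api_key},
            params={"q": query, "count": count},
            timeout=15,
        )
        resp.raise_for_status()
        pages = resp.json().get("webPages", {}).get("value", [])
        return [{"name": p["name"], "url": p["url"], "snippet": p["snippet"]} for p in pages]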

Could user authentication be added?

Running this on a VPS is convenient because there's no proxy to deal with, but exposing it with no user authentication at all is risky.

It can be done with nginx today, but it would be even better if the app supported it directly.
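Until that exists, here is a hedged sketch of a minimal in-app gate with Flask, which the project already uses (the app object and credential source here are assumptions, not the project's actual code):

    import os
    from flask import Flask, request, Response

    app = Flask(__name__)  # hypothetical stand-in for the app defined in main.py

    USERNAME = os.environ.get("EX_CHATGPT_USER", "admin")
    PASSWORD = os.environ.get("EX_CHATGPT_PASS", "change-me")

    @app.before_request
    def require_basic_auth():
        # Reject any request that does not carry valid HTTP Basic credentials.
        auth = request.authorization
        if auth is None or auth.username != USERNAME or auth.password != PASSWORD:
            return Response("Authentication required", 401,
                            {"WWW-Authenticate": 'Basic realm="EX-chatGPT"'})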

On manual server deployment (non-docker)

If the backend reports a successful start and the firewall is configured correctly but the site still cannot be reached by IP, comment out the first code block containing 127.0.0.1 in the main section of main.py and use the 0.0.0.0 block instead. (See the sketch below.)
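In Flask terms the difference is just the host argument passed to app.run; a minimal sketch of the idea (the app object stands in for the one defined in main.py):

    from flask import Flask

    app = Flask(__name__)  # stand-in for the app in main.py

    # Binding to 127.0.0.1 only accepts connections from the machine itself:
    # app.run(host="127.0.0.1", port=5000)

    # Binding to 0.0.0.0 listens on all interfaces, so the server is reachable
    # via the machine's LAN/public IP (make sure the firewall or reverse proxy
    # is configured accordingly):
    app.run(host="0.0.0.0", port=5000)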

Project deployment error

Self-assessment: beginner+
After a dozen-plus hours of trying, I still cannot get this project running locally, so I'm asking for help.
Where I'm stuck: the UI loads fine, but the chat replies are empty; only the cost is shown.
Deployment environment: CentOS 7.9.2009 + docker; the machine also runs the BaoTa (宝塔) panel.
Backend log:
[screenshot]
Possible cause: incorrect configuration. My current apikey.ini is below, with the secrets redacted.
[screenshot]
What I have tried:
1. Today I saw that the author updated docker-compose.yaml with preset environment variables. I don't know the project well enough to tell whether those key:value pairs need to match the key:value pairs in apikey.ini, so I tried filling them in. The change is shown below; it did not solve the problem.
[screenshot]
2. I found a similar issue via Baidu that blamed the urllib3 version being too new. I added urllib3==1.25.11 to requirements.txt and rebuilt the image, but the problem persists.
Also: the commands in the README don't work for me as written; the backend log reports permission errors, and I could only open 127.0.0.1:5000 after mounting a volume for the container.

My guess: the problem still lies between apikey.ini and docker-compose.yaml. If so, could the author explain how the two relate? Much appreciated.

proxy issue

Environment: WSL2, Ubuntu 22.04 LTS
The program is launched through proxychains over a SOCKS5 tunnel
Traceback (most recent call last):
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2091, in call
return self.wsgi_app(environ, start_response)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2076, in wsgi_app
response = self.handle_exception(e)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2073, in wsgi_app
response = self.full_dispatch_request()
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1518, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1516, in full_dispatch_request
rv = self.dispatch_request()
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1502, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/home/collins/EX-chatGPT/chatGPTEx/main.py", line 55, in get_bot_response
res = parse_text(detail(q,conv_id=uuid))
File "/home/collins/EX-chatGPT/chatGPTEx/search.py", line 46, in detail
call_res0 = search(APIQuery(query),1000)
File "/home/collins/EX-chatGPT/chatGPTEx/search.py", line 130, in APIQuery
response = chatbot.ask(prompt,convo_id='api')
File "/home/collins/EX-chatGPT/chatGPTEx/optimizeOpenAI.py", line 219, in ask
full_response: str = "".join(response)
File "/home/collins/EX-chatGPT/chatGPTEx/optimizeOpenAI.py", line 163, in ask_stream
API_PROXY = str(config['Proxy']['api_proxy'])
File "/usr/lib/python3.10/configparser.py", line 964, in getitem
raise KeyError(key)
KeyError: 'Proxy'

ChatGPT's answer:
This error occurred while running the Python code; it tells us the program failed while parsing the configuration file because no section named "Proxy" was found.

Specifically, the program uses the ConfigParser module from the Python standard library to read the configuration file, but the file contains no section named "Proxy". Possible causes are an incorrectly formatted configuration file or a missing required entry.

Ways to resolve this error include:

Check that the configuration file path used by the program is correct.
Check that the configuration file contains a "Proxy" section and that its format is correct.
If the program depends on an external configuration file, try re-downloading or updating it.
Hopefully these hints help you resolve the problem.
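Concretely, the KeyError means the ini file the program loaded has no [Proxy] section at all. The configuration pasted in the "SSL error" issue above shows the expected shape; a minimal excerpt (api_proxy can stay pointed at the official endpoint if no mirror is used):

    ; apikey.ini (excerpt) -- the section name must match what optimizeOpenAI.py reads
    [Proxy]
    api_proxy = https://api.openai.com/v1/chat/completions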

Token usage is extreme

Configuration is below. The token consumption is absurd; even a casual sentence costs thousands of tokens. How can I debug this? Installed via docker.

Running on a VPS in Los Angeles
[screenshot]

[screenshot]
[screenshot]
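Some of this is likely expected: in web/API mode each question also sends the search prompts and the retrieved results to the model, not just your message. To see where the tokens actually go, you can count them locally with tiktoken (a project dependency, judging by the build log in an earlier issue); a hedged sketch:

    import tiktoken

    def count_tokens(text: str, model: str = "gpt-3.5-turbo") -> int:
        # Count how many tokens a piece of prompt text will cost before sending it.
        try:
            enc = tiktoken.encoding_for_model(model)
        except KeyError:
            enc = tiktoken.get_encoding("cl100k_base")
        return len(enc.encode(text))

    # Example: measure the full prompt that is actually sent
    # (system prompt + search results + your question).
    print(count_tokens("Query: hi"))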

Dictionary Changed Size During Iteration

Web mode hangs for me, and the following error appears in the log:

Chat history backup completed: chathistory_backup_20230323-1027.json
API calls: {'calls': [{'API': 'Google', 'query': '苏剑林 科学空间 LLMDecoder-only的架构 文章'}, {'API': 'WikiSearch', 'query': '苏剑林'}, {'API': 'Google', 'query': 'LLMDecoder-only的架构是什么?'}]}

Debugging middleware caught exception in streamed response at a point where response headers were already sent.
Traceback (most recent call last):
  File "/app/chatGPTEx/search.py", line 82, in detail
    call_res1 = search(APIExtraQuery(query,call_res0),1000)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/chatGPTEx/search.py", line 270, in search
    for key,value in call_res.items():
RuntimeError: dictionary changed size during iteration
127.0.0.1 - - [23/Mar/2023 10:27:00] "GET /api/query?mode=detail&uuid=da98b1a1-8719-4b44-89b3-14044dfc1145&prompt=&msg=给我介绍一下苏剑林在科学空间中发的《为什么现在的LLM都是Decoder-only的架构?》这篇文章 HTTP/1.1" 200 -
Chat history backup completed: chathistory_backup_20230323-1027.json
127.0.0.1 - - [23/Mar/2023 10:27:00] "POST /api/addChat HTTP/1.1" 200 -
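The traceback points at search.py line 270, where call_res is being modified while it is iterated. The usual fix is to iterate over a snapshot of the items; a small self-contained sketch of the pattern (the real dictionary and loop body in search.py are not reproduced here):

    # Iterating over a snapshot lets the loop body add or remove keys safely;
    # iterating over call_res.items() directly while mutating the dict raises
    # "RuntimeError: dictionary changed size during iteration".
    call_res = {"Google": "result A", "WikiSearch": "result B"}

    for key, value in list(call_res.items()):
        call_res[key + "_done"] = value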

Cannot install azure-storage

(base) he_muling@MacBook-Pro EX-chatGPT % pip install -r requirements.txt
Collecting azure_storage==0.37.0
  Using cached azure-storage-0.37.0.zip (4.3 kB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/private/var/folders/4z/kspnn2f503550_nfygz4qbr00000gn/T/pip-install-c52stlm7/azure-storage_b80c1df728eb408b8e92646cdd6a0b69/setup.py", line 55, in <module>
          raise RuntimeError(message)
      RuntimeError:
      
      Starting with v0.37.0, the 'azure-storage' meta-package is deprecated and cannot be installed anymore.
      Please install the service specific packages prefixed by `azure` needed for your application.
      
      The complete list of available packages can be found at:
      https://aka.ms/azsdk/python/all
      
      Here's a non-exhaustive list of common packages:
      
      - [azure-storage-blob](https://pypi.org/project/azure-storage-blob) : Blob storage client
      - [azure-storage-file-share](https://pypi.org/project/azure-storage-file-share) : Storage file share client
      - [azure-storage-file-datalake](https://pypi.org/project/azure-storage-file-datalake) : ADLS Gen2 client
      - [azure-storage-queue](https://pypi.org/project/azure-storage-queue): Queue storage client
      
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
