circlestarzero / ex-chatgpt

Let ChatGPT truly learn how to go online and call APIs! 'EX-ChatGPT' can rival and even surpass NewBing

License: MIT License

TypeScript 34.21% Python 35.63% JavaScript 9.72% HTML 11.39% CSS 6.99% Dockerfile 0.83% Shell 1.23%

ex-chatgpt's Introduction

Ex-ChatGPT - ChatGPT with ToolFormer


Background

ChatGPT is a powerful tool platform that can generate API requests to help answer questions without any fine-tuning. Ex-ChatGPT lets ChatGPT call external APIs such as WolframAlpha, Google and WikiMedia to provide more accurate and up-to-date answers.

The project consists of two parts: Ex-ChatGPT and WebChatGPTEnhance. The former is a service built on the GPT-3.5 Turbo API together with the WolframAlpha, Google and WikiMedia APIs, providing more powerful functionality and more accurate answers. The latter is a browser extension that updates the original WebChatGPT plugin to support adding external APIs, letting the ChatGPT web page call different APIs and prompts.

User Interface

ExChatGPT

chatHistory

WebChatGPTEnhance

WebChatGPT

Highlights

  • OAuth2.0 multi-user authentication management (see the webTest branch)
  • Voice chat using the Microsoft Azure API, with optimized response time (around 1-2 seconds), including speech recognition and text-to-speech; multiple voices and languages, custom voices supported
  • Docker and proxy support
  • Redundant backups of chat history
  • Support for the OpenAI GPT-3.5 Turbo API
  • Lets ChatGPT call external APIs (Google, WolframAlpha, WikiMedia)
  • Cleans Google search results to cut token usage
  • Automatically saves and loads conversation history, automatically compresses conversations
  • Shows the number of tokens used
  • API key pool and API cooldown
  • Markdown and MathJax rendering
  • Animation while APIs are being called, similar to Bing
  • Conversation history management and loading, ChatGPT-like page layout
  • Keyboard shortcuts: Tab to switch modes, Shift+Enter for a new line, Enter to send, Up/Down to recall previously sent messages, like a terminal
  • Streaming output: a typewriter-like effect that shows results progressively instead of all at once, so responses feel faster (demo: stream; see the sketch after this list)
  • Prompt auto-completion in chat mode, with fuzzy and pinyin search and custom prompts; the prompts from awesome-chatgpt-prompts are bundled (demo: promptCompletion)
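The streaming behaviour above can be reproduced with very little code. Below is a minimal sketch in the same spirit as ask_stream in chatGPTEx/optimizeOpenAI.py, but simplified and not taken from it; the OPENAI_API_KEY environment variable and the hard-coded model name are assumptions for illustration.

    import json
    import os

    import requests

    def stream_chat(prompt: str) -> str:
        # With stream=True the chat completions endpoint emits "data: {...}"
        # lines and a final "data: [DONE]" marker; each chunk carries a small
        # piece of the answer, which gives the typewriter effect.
        resp = requests.post(
            "https://api.openai.com/v1/chat/completions",
            headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
            json={
                "model": "gpt-3.5-turbo",
                "messages": [{"role": "user", "content": prompt}],
                "stream": True,
            },
            stream=True,
            timeout=60,
        )
        full = ""
        for line in resp.iter_lines():
            if not line or not line.startswith(b"data: "):
                continue
            payload = line[len(b"data: "):]
            if payload == b"[DONE]":
                break
            piece = json.loads(payload)["choices"][0]["delta"].get("content", "")
            print(piece, end="", flush=True)  # print each piece as it arrives
            full += piece
        return full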

Planned Updates

  • Mobile UI adaptation
  • OCR of formulas and text in uploaded images
  • OAuth2.0 multi-user authentication (see the webTest branch)
  • Calling a diffusion model to generate images (for a multimodal-like experience)
  • Further crawling, summarizing and cleaning of web search results
  • A code-execution API, plus more APIs
  • Embedding-based retrieval over chat history / local knowledge bases

Installation

Ex-chatGPT Installation

  • Run pip install -r requirements.txt
  • Copy apikey.ini.example to apikey.ini, then fill in your API keys and proxy settings in apikey.ini (if you only have one OpenAI API key, just delete the key1 = sk-xxxx; key2 = sk-xxxx lines); a configparser sketch follows this list
  • Apply for a Google API key and search engine ID
  • Apply for a WolframAlpha app ID key
  • Apply for an OpenAI API key (new) or a ChatGPT access_token (legacy)
  • (Optional) Fill in your Azure API key and region in apikey.ini
  • Run main.py and open http://127.0.0.1:1234/
  • Select a mode (Tab works), e.g. chat, detail, web, webDirect, WebKeyWord
  • In chat mode, use the \{promptname} {query} format to fuzzy-search and select a prompt
  • Keyboard shortcuts: Tab to switch modes, Shift+Enter for a new line, Enter to send, Up/Down to recall previously sent messages, like a terminal
  • Voice chat (optional): choose the language and voice in chatGPTEx/static/styles/tts.js, then click the microphone in the chat UI to toggle conversation mode
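The configparser sketch referenced above is purely illustrative: it shows how a backend like this one reads apikey.ini (the KeyError traceback in the issues further down comes from exactly this kind of lookup). The section and key names follow the example config quoted in the SSL-error issue below and may not match your apikey.ini.example exactly.

    import configparser

    # Illustrative only: section/key names follow the apikey.ini quoted in the
    # issues further down; apikey.ini.example is the authoritative reference.
    config = configparser.ConfigParser()
    config.read("apikey.ini")

    openai_keys = [v for k, v in config["OpenAI"].items() if k.startswith("key")]
    google_api_key = config["Google"]["GOOGLE_API_KEY"]
    search_engine_id = config["Google"]["SEARCH_ENGINE_ID"]
    wolfram_app_id = config["WolframAlpha"]["WOLFRAMALPHA_APP_ID"]

    # Indexing a missing section raises KeyError (e.g. KeyError: 'Proxy' in the
    # proxy issue below); config.get with a fallback avoids that.
    api_proxy = config.get("Proxy", "api_proxy", fallback="")

    print(len(openai_keys), "OpenAI key(s) loaded; proxy endpoint:", api_proxy or "none")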

Docker Quick Deployment

Method 1: Use the pre-built image
  1. Create the config directory and fetch the config file

    mkdir config && wget https://raw.githubusercontent.com/circlestarzero/EX-chatGPT/main/chatGPTEx/apikey.ini.example -O ./config/apikey.ini

  2. Edit the config file, or copy an already edited config file into the config folder.

    vim ./config/apikey.ini

  3. Pull the Docker image

    docker pull 0nlylty/exchatgpt:latest

  4. Create the container

    docker run -dit \
      -v ~/config:/config \
      -p 5000:5000 \
      --name exchatgpt \
      --restart unless-stopped \
     0nlylty/exchatgpt:latest
Method 2: Build the image yourself
  1. Create the config directory and fetch the config file

    mkdir config && wget https://raw.githubusercontent.com/circlestarzero/EX-chatGPT/main/chatGPTEx/apikey.ini.example -O ./config/apikey.ini

  2. Edit the config file, or copy an already edited config file into the config folder.

    vim ./config/apikey.ini

  3. Build and run

    # Clone the code
    git clone https://github.com/circlestarzero/EX-chatGPT.git --depth=1
    # Enter the project directory
    cd EX-chatGPT/chatGPTEx
    # Edit the mount path in docker-compose.yaml
    ~/config:/config   # change the part left of the colon to the path where your config is stored
    # Once the configuration is complete, start it
    docker compose up -d
    
Usage
# Access the web UI
http://your_ip:5000

# Tail the logs
docker logs -f --tail 100 exchatgpt

WebChatGPTEnhance Installation

  • Fill in your Google API information in chatGPTChromeEhance/src/util/apiManager.ts (getDefaultAPI)
  • Run npm install
  • Run npm run build-prod
  • Get the built extension from chatGPTChromeEhance/build
  • Add your prompts and APIs on the options page
  • Example APIs and prompts are in /WebChatGPTAPI
  • WolframAlpha needs a local server running: WebChatGPTAPI/WolframLocalServer.py

Mode Overview

Web Mode

Web Mode starts by asking ChatGPT the question directly. ChatGPT then generates a series of API calls related to the query, and the first returned result is used together with the question for verification and supplementation. Finally, ChatGPT summarizes the information. Web Mode has better conversational ability than a summarize-only response.

Chat Mode

Chat Mode only calls the OpenAI API, much like the ChatGPT web version. You can search for and select different prompts by typing /promptname; fuzzy search is supported.

WebDirect Mode

WebDirect Mode first has ChatGPT generate a series of API calls related to the query. It then calls the third-party APIs directly to search for the answer to each query, and finally ChatGPT summarizes the information. For single-query information, WebDirect Mode is faster and comparatively more accurate.
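A simplified, hypothetical sketch of that pipeline (the real logic lives in chatGPTEx/search.py; the helper callables and the JSON plan shape, which mirrors the "API calls:" log lines quoted in the issues below, are assumptions for illustration):

    import json
    from typing import Callable

    def web_direct(query: str,
                   ask_chatgpt: Callable[[str], str],
                   call_api: Callable[[str, str], str]) -> str:
        # 1. Ask ChatGPT to plan the external API calls for this query.
        #    The {"calls": [{"API": ..., "query": ...}]} shape follows the
        #    "API calls:" log lines shown in the issues section.
        plan = json.loads(ask_chatgpt(
            "Reply with JSON listing the Google / WolframAlpha / WikiSearch "
            f"calls needed to answer: {query}"
        ))
        # 2. Call each third-party API directly and collect the raw results.
        results = [call_api(c["API"], c["query"]) for c in plan["calls"]]
        # 3. Let ChatGPT summarize the collected evidence into the final answer.
        return ask_chatgpt(f"Query: {query}\nAPI results: {results}\nSummarize an answer.")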

Detail Mode

Detail Mode is an extension of WebDirect Mode: it makes additional API calls to supplement information not found in the current results (for example, information missed by earlier searches). Finally, ChatGPT summarizes the information.

Keyword Mode

Keyword Mode has ChatGPT generate keywords for the query directly and searches them with DDG, so no extra API keys are needed, but its accuracy is comparatively poor.

Changelog

  • OAuth2.0 multi-user authentication management (see the webTest branch)
  • Cleaning of Google search results to cut token usage
  • Updated the proxy pool for all APIs and added an API rate-limit cooldown mechanism (a Google 403 triggers a 1-day cooldown)
  • Voice chat using the Microsoft Azure API with optimized response time, including speech recognition and text-to-speech; multiple voices and languages, custom voices
  • Streaming output: a typewriter-like effect that shows results progressively instead of all at once, so responses feel faster (demo: stream)
  • Redundant backups of chat history
  • Prompt auto-completion in chat mode, with fuzzy and pinyin search

promptCompletion

  • Added Docker and proxy support
  • Support for the OpenAI GPT-3.5 Turbo API, fast and inexpensive
  • Additional API calls and search summaries for more comprehensive and detailed answers
  • Keyboard shortcuts: Tab to switch modes, Shift+Enter for a new line, Enter to send; Up/Down to recall previously sent messages, like a terminal
  • Updated conversation history management: load, delete and save past conversations

chatHistory

  • Updated the API-call processing animation

APIAnimation

  • Page beautification

WebBeautification

  • Markdown and MathJax renderer

MathJax

  • Updated the chat-history token optimizer; Web mode can now respond based on chat history; added a token cost counter

history

  • Updated Web chat mode selection, optimized prompts and token cost, and capped the token limit

mode

  • Improved support for Chinese queries and added current-date information

date

  • Updated Web chat mode and fixed some bugs
  • Updated the API configuration
  • Updated the API pool
  • Automatic saving and loading of conversation history, so ChatGPT can refer back to earlier conversation

ex-chatgpt's People

Contributors

acdiost, circlestarzero, mirokymac, onlylty, yuzhenqin


ex-chatgpt's Issues

Delete Conversation

Here is a problem: how do I delete a conversation and start a new chat?

proxy issue

Environment: WSL2, Ubuntu 22.04 LTS
The program is run through proxychains over a SOCKS5 tunnel
Traceback (most recent call last):
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2091, in __call__
return self.wsgi_app(environ, start_response)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2076, in wsgi_app
response = self.handle_exception(e)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2073, in wsgi_app
response = self.full_dispatch_request()
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1518, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1516, in full_dispatch_request
rv = self.dispatch_request()
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1502, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/home/collins/EX-chatGPT/chatGPTEx/main.py", line 55, in get_bot_response
res = parse_text(detail(q,conv_id=uuid))
File "/home/collins/EX-chatGPT/chatGPTEx/search.py", line 46, in detail
call_res0 = search(APIQuery(query),1000)
File "/home/collins/EX-chatGPT/chatGPTEx/search.py", line 130, in APIQuery
response = chatbot.ask(prompt,convo_id='api')
File "/home/collins/EX-chatGPT/chatGPTEx/optimizeOpenAI.py", line 219, in ask
full_response: str = "".join(response)
File "/home/collins/EX-chatGPT/chatGPTEx/optimizeOpenAI.py", line 163, in ask_stream
API_PROXY = str(config['Proxy']['api_proxy'])
File "/usr/lib/python3.10/configparser.py", line 964, in __getitem__
raise KeyError(key)
KeyError: 'Proxy'

Answer from ChatGPT:
This error occurred while running the Python code; it says the program failed while parsing the configuration file because no key named "Proxy" was found.

Specifically, the program uses the ConfigParser module from the Python standard library to read the configuration file, but the file contains no key named "Proxy". Possible causes are an incorrectly formatted configuration file or a missing required entry.

Ways to resolve this error include:

Check that the configuration file path used by the program is correct.
Check that the configuration file contains a "Proxy" key and that its format is correct.
If the program depends on an external configuration file, try re-downloading or updating it.
Hopefully these hints help you solve the problem.

Could you add user authentication?

Running this on a VPS is convenient since you don't have to deal with a proxy separately, but exposing it with no user authentication is risky.

It can be done with nginx today, but built-in support would be even better.

Error in search.py

Error in search.py
image

The browser UI shows only the token count
image

Switching back to the old version from the 9th or earlier works fine.

Cannot install azure-storage

(base) he_muling@MacBook-Pro EX-chatGPT % pip install -r requirements.txt
Collecting azure_storage==0.37.0
  Using cached azure-storage-0.37.0.zip (4.3 kB)
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/private/var/folders/4z/kspnn2f503550_nfygz4qbr00000gn/T/pip-install-c52stlm7/azure-storage_b80c1df728eb408b8e92646cdd6a0b69/setup.py", line 55, in <module>
          raise RuntimeError(message)
      RuntimeError:
      
      Starting with v0.37.0, the 'azure-storage' meta-package is deprecated and cannot be installed anymore.
      Please install the service specific packages prefixed by `azure` needed for your application.
      
      The complete list of available packages can be found at:
      https://aka.ms/azsdk/python/all
      
      Here's a non-exhaustive list of common packages:
      
      - [azure-storage-blob](https://pypi.org/project/azure-storage-blob) : Blob storage client
      - [azure-storage-file-share](https://pypi.org/project/azure-storage-file-share) : Storage file share client
      - [azure-storage-file-datalake](https://pypi.org/project/azure-storage-file-datalake) : ADLS Gen2 client
      - [azure-storage-queue](https://pypi.org/project/azure-storage-queue): Queue storage client
      
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

SSL error

Environment: WSL2, Ubuntu 22.04 LTS
The program is run through proxychains over a SOCKS5 tunnel
Traceback (most recent call last):
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2091, in __call__
return self.wsgi_app(environ, start_response)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2076, in wsgi_app
response = self.handle_exception(e)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 2073, in wsgi_app
response = self.full_dispatch_request()
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1518, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1516, in full_dispatch_request
rv = self.dispatch_request()
File "/home/collins/.local/lib/python3.10/site-packages/flask/app.py", line 1502, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/home/collins/EX-chatGPT/chatGPTEx/main.py", line 51, in get_bot_response
res = parse_text(web(q,conv_id=uuid))
File "/home/collins/EX-chatGPT/chatGPTEx/search.py", line 58, in web
resp = directQuery(f'Chat History info: {chatbot.conversation[conv_id]}\n Query: {query}', conv_id= conv_id)
File "/home/collins/EX-chatGPT/chatGPTEx/search.py", line 116, in directQuery
response = chatbot.ask(prompt+'\n'+query,convo_id=conv_id)
File "/home/collins/EX-chatGPT/chatGPTEx/optimizeOpenAI.py", line 219, in ask
full_response: str = "".join(response)
File "/home/collins/EX-chatGPT/chatGPTEx/optimizeOpenAI.py", line 165, in ask_stream
response = self.session.post(
File "/home/collins/.local/lib/python3.10/site-packages/requests/sessions.py", line 590, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/home/collins/.local/lib/python3.10/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/home/collins/.local/lib/python3.10/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/home/collins/.local/lib/python3.10/site-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by SSLError(SSLZeroReturnError(6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:997)')))

apikey.ini
[Google]
GOOGLE_API_KEY = ***
SEARCH_ENGINE_ID = ***
[OpenAI]
key0 =***
[WolframAlpha]
WOLFRAMALPHA_APP_ID =***
[Proxy]
api_proxy =https://api.openai.com/v1/chat/completions

Answer from ChatGPT:
This error is usually caused by the server not responding or by the network connection being interrupted. If it happens while accessing an API with curl, try the following:

Make sure the target API service is running; try opening the API URL in a browser to see whether it is reachable.
Check your network connection and make sure your machine can reach the target API server. You can test connectivity with ping, e.g. ping api.example.com.
If the target API uses HTTPS, make sure your machine has the appropriate SSL certificates, otherwise a secure connection may fail. You can try curl's --insecure option to disable certificate verification, e.g. curl --insecure https://api.example.com.
Try accessing the target API with another tool or library (such as Postman or requests) to see whether the problem is curl-specific. If other tools cannot reach the API either, the problem is probably on the API service side.
If none of the above solves it, contact the API provider for further support.

linux shell command:
proxychains curl chat.openai.com
ProxyChains-3.1 (http://proxychains.sf.net)
|S-chain|-<>-172.18.144.1:58225-<><>-40.90.4.2:80-<><>-OK
curl: (52) Empty reply from server

clash log:
image
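One workaround worth checking for this class of SSL/EOF failure, independent of this project's code, is to point requests at the SOCKS proxy directly instead of wrapping the whole process with proxychains. The sketch below assumes the socks extra (PySocks, i.e. pip install "requests[socks]") is installed; the proxy address and the key are placeholders.

    import requests

    session = requests.Session()
    # Placeholder address/port: replace with your own local SOCKS5 proxy.
    session.proxies = {
        "http": "socks5h://127.0.0.1:7890",
        "https": "socks5h://127.0.0.1:7890",
    }
    resp = session.post(
        "https://api.openai.com/v1/chat/completions",
        headers={"Authorization": "Bearer sk-..."},  # your OpenAI key
        json={"model": "gpt-3.5-turbo",
              "messages": [{"role": "user", "content": "hi"}]},
        timeout=60,
    )
    print(resp.status_code)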

Context memory

A question: the API-calling feature works, but has context memory been lost? It seems to have turned into single-turn Q&A.

About citations in answers

I hope the final answer can include citations, like New Bing does. I tested this directly with ChatGPT and it should be feasible.
IMG_20230304_190907_033

Also, instead of always saying "according to the results from the API", it would be better to state which website or service the information comes from and then give a citation.

Suggestions about the project and API calls

The author of this issue would like to contribute code to the repository; this issue is only meant to confirm that the proposed features are ones the maintainer actually wants. Please don't fork the main repo and start implementing just because you saw this issue.

API

  • Support the SearXNG API (it covers many search engines, e.g. Google, Bing, DDG, Wikipedia, Wikidata, StackOverflow, with no API keys required)
  • Pre-translate with a translation API, or let (Chat)GPT translate first (similar to New Bing; there is far more English content on the web, and the Simplified-Chinese web has more incorrect content) (completion status unclear)
  • Support the DALL-E / Codex APIs (ChatGPT can already display images via Markdown syntax, so consider also wiring up DALL-E; Codex may be more specialized for code)
  • Consider having ChatGPT's answers stop emphasizing "from the API" (completion status unclear)

Project

  • Consider automated builds with GitHub Actions
  • Consider adding unit tests
  • Naming (Chat/chat, Ex-chatGPT/chatGPTEx, the "Ehance" typo)
  • Documentation

Dictionary Changed Size During Iteration

It hangs when I use web mode; the following error appears in the log

Chat history backup completed: chathistory_backup_20230323-1027.json
API calls: {'calls': [{'API': 'Google', 'query': '苏剑林 科学空间 LLMDecoder-only的架构 文章'}, {'API': 'WikiSearch', 'query': '苏剑林'}, {'API': 'Google', 'query': 'LLMDecoder-only的架构是什么?'}]}

Debugging middleware caught exception in streamed response at a point where response headers were already sent.
Traceback (most recent call last):
  File "/app/chatGPTEx/search.py", line 82, in detail
    call_res1 = search(APIExtraQuery(query,call_res0),1000)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/chatGPTEx/search.py", line 270, in search
    for key,value in call_res.items():
RuntimeError: dictionary changed size during iteration
127.0.0.1 - - [23/Mar/2023 10:27:00] "GET /api/query?mode=detail&uuid=da98b1a1-8719-4b44-89b3-14044dfc1145&prompt=&msg=给我介绍一下苏剑林在科学空间中发的《为什么现在的LLM都是Decoder-only的架构?》这篇文章 HTTP/1.1" 200 -
Chat history backup completed: chathistory_backup_20230323-1027.json
127.0.0.1 - - [23/Mar/2023 10:27:00] "POST /api/addChat HTTP/1.1" 200 -
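For reference, this RuntimeError means a dict was mutated while it was being iterated (here the result dict inside search() in chatGPTEx/search.py, presumably filled concurrently while the loop runs). A minimal, hypothetical sketch of the usual fix, iterating over a snapshot of the items:

    # Hypothetical reduction of the failing loop; call_res is the name from the
    # traceback, the contents are made up.
    call_res = {"Google: example query": "result A", "WikiSearch: example": "result B"}

    summary_parts = []
    # list(...) takes a snapshot, so concurrent inserts into call_res no longer
    # raise "dictionary changed size during iteration".
    for key, value in list(call_res.items()):
        summary_parts.append(f"{key}: {value}")
    print("\n".join(summary_parts))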

Static assets not accessible when deployed with Docker

  • I have checked the documentation and the existing issues; there is no similar problem

Deployment

  • CentOS, deployed with Docker, reverse-proxied with nginx for domain-based access

Problem

After visiting the site, the page opens but there is no UI
CleanShot 2023-04-04 at 16 20 35@2x
Trying to chat, the buttons do nothing after clicking; the console shows that the static assets and TS scripts failed to load, returning 404
CleanShot 2023-04-04 at 16 21 34@2x

Could someone advise how to configure this so the static assets are reachable?

Requests always return EOF occurred in violation of protocol (_ssl.c:992)

  • aarch64 GNU/Linux architecture
  • A proxy is set up; curl www.google.com works inside the container
  • urllib3==1.25.11 inside the container
  • Error log below:
192.168.1.109 - - [18/Mar/2023 18:30:45] "GET /api/APIProcess HTTP/1.1" 200 -

Debugging middleware caught exception in streamed response at a point where response headers were already sent.
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 670, in urlopen
    httplib_response = self._make_request(
                       ^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 381, in _make_request
    self._validate_conn(conn)
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 978, in _validate_conn
    conn.connect()
  File "/usr/local/lib/python3.11/site-packages/urllib3/connection.py", line 362, in connect
    self.sock = ssl_wrap_socket(
                ^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/util/ssl_.py", line 386, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/ssl.py", line 517, in wrap_socket
    return self.sslsocket_class._create(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/ssl.py", line 1075, in _create
    self.do_handshake()
  File "/usr/local/lib/python3.11/ssl.py", line 1346, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:992)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
           ^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/connectionpool.py", line 726, in urlopen
    retries = retries.increment(
              ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/urllib3/util/retry.py", line 446, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:992)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/chatGPTEx/search.py", line 93, in web
    resp = directQuery(f'Query: {query}', conv_id=  conv_id)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/chatGPTEx/search.py", line 153, in directQuery
    response = chatbot.ask(prompt+'\n'+query,convo_id=conv_id)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/chatGPTEx/optimizeOpenAI.py", line 206, in ask
    full_response: str = "".join(response)
                         ^^^^^^^^^^^^^^^^^
  File "/app/chatGPTEx/optimizeOpenAI.py", line 160, in ask_stream
    response = self.session.post(
               ^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 590, in post
    return self.request('POST', url, data=data, json=json, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/requests/adapters.py", line 514, in send
    raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:992)')))

192.168.1.109 - - [18/Mar/2023 18:30:45] "GET /api/query?mode=web&uuid=default&prompt=&msg=hi HTTP/1.1" 200 -

What could the problem be?

Multi-round search

Sometimes the bot may need one round of search to figure out which API to call next before it can finish the request. Does the current bot support multi-round search?
Also, could the bot be given the ability to fetch web pages directly for material? For example, if the first round of search finds that URL1 might be helpful, the second round could let the bot request URL1 directly to read its content.

Chrome extension

Could you package your modified Chrome extension? That would be more convenient; building it myself is a hassle. Thanks.
It would also help to flesh out the README; it's still unclear how to use this. Thanks.

Surprised this doesn't use the langchain package?

I've been studying langchain and gpt_index for a while. At first glance I assumed this project was a web wrapper around langchain, but on closer inspection it isn't. Is the author familiar with that library? Combining them could cut down on reinventing the wheel. Or has the author already borrowed its design and just reworked the prompts?

That said, this project is genuinely impressive. Ever since seeing langchain and New Bing I've wanted to build a similar tool for my own use but never found the time; with this project I can use one directly, kudos to the author. It might also be worth integrating long-document and PDF reading by querying gpt_index directly.

Project deployment error

Self-assessment: complete beginner
After a dozen-plus hours of trying, I still can't get this project working on my machine, so I'm asking for help.
Where the problem occurs: I can reach the UI, but chat replies are empty; only the cost is shown.
Deployment environment: CentOS 7.9.2009 + Docker; the machine also runs the BT (BaoTa) panel.
Backend log:
image
Possible cause from my analysis: an incorrect configuration file. The current apikey.ini is shown below (secrets redacted).
image
What I have already tried:
1. I saw that docker-compose.yaml was updated today with preset environment variables. I don't know the project well enough to tell whether those key:value pairs must correspond to the key:value pairs in apikey.ini, so I tried filling them in. The change is shown below; the problem was not solved.
image
2. I found a workaround for a similar problem on Baidu that blamed a too-new urllib3, so I added urllib3==1.25.11 to requirements.txt and rebuilt the image; the problem persists.
Also: the commands from the README don't work for me locally; the backend log reports permission problems, and I could only open 127.0.0.1:5000 after mounting a volume for the container.

My guess: it's still something between apikey.ini and docker-compose.yaml. If so, could you explain how they relate? Much appreciated.

How do I deploy this project?

I'm really interested in this but have no relevant experience at all. Could the author reply to me? I'm happy to sponsor.

pip install error

I asked GPT-4:

It looks like you are trying to install the deprecated azure-storage package, which is no longer supported. Instead, you should install the appropriate packages as needed. Based on the error message, some common alternatives are:

azure-storage-blob: Blob storage client
azure-storage-file-share: file share storage client
azure-storage-file-datalake: ADLS Gen2 client
azure-storage-queue: queue storage client
First, choose the package that fits your needs. Then replace azure_storage==0.37.0 in requirements.txt with the chosen package. For example, if you need the Blob storage client, replace it with azure-storage-blob.

Once that is done, rerun pip3 install -r requirements.txt to install the new dependencies.

ai\EX-chatGPT> pip3 install -r requirements.txt
Collecting azure_storage==0.37.0
Downloading azure-storage-0.37.0.zip (4.3 kB)
Preparing metadata (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
Traceback (most recent call last):
File "<string>", line 2, in <module>
File "<pip-setuptools-caller>", line 34, in <module>
File "C:\Users\myy\AppData\Local\Temp\pip-install-b7dcmfhu\azure-storage_f3ad491fda9f4dff88388d9fe926a6f4\setup.py", line 55, in
raise RuntimeError(message)
RuntimeError:

  Starting with v0.37.0, the 'azure-storage' meta-package is deprecated and cannot be installed anymore.
  Please install the service specific packages prefixed by `azure` needed for your application.

  The complete list of available packages can be found at:
  https://aka.ms/azsdk/python/all

  Here's a non-exhaustive list of common packages:

  - [azure-storage-blob](https://pypi.org/project/azure-storage-blob) : Blob storage client
  - [azure-storage-file-share](https://pypi.org/project/azure-storage-file-share) : Storage file share client
  - [azure-storage-file-datalake](https://pypi.org/project/azure-storage-file-datalake) : ADLS Gen2 client
  - [azure-storage-queue](https://pypi.org/project/azure-storage-queue): Queue storage client

  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details

About manual server deployment configuration (non-Docker)

If the backend reports a successful deployment and the firewall is configured correctly but the site still cannot be reached by IP, comment out the first code block containing 127.0.0.1 in the main function of main.py and use the block with 0.0.0.0 instead, as sketched below.
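A minimal sketch of that switch, assuming the usual Flask pattern (main.py is a Flask app according to the tracebacks elsewhere in these issues; the app object and port number here are placeholders, not the project's actual values):

    from flask import Flask

    app = Flask(__name__)  # stand-in for the app defined in main.py

    if __name__ == "__main__":
        # app.run(host="127.0.0.1", port=5000)  # reachable only from the machine itself
        app.run(host="0.0.0.0", port=5000)      # reachable from other hosts via the server's IP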

I'd like to know how the "Google search engine id" is billed

How is the "Google search engine id" billed? Also, when I asked ChatGPT about it, it said: "When using the Google Custom Search Engine API, you can make 100 requests per day (i.e. 100 searches per day); beyond that limit you have to pay." Is that true?

No answer is returned

Traceback (most recent call last):
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 2091, in __call__
return self.wsgi_app(environ, start_response)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 2076, in wsgi_app
response = self.handle_exception(e)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 2073, in wsgi_app
response = self.full_dispatch_request()
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 1518, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 1516, in full_dispatch_request
rv = self.dispatch_request()
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/flask/app.py", line 1502, in dispatch_request
return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/main.py", line 34, in get_bot_response
res = parse_text(web(q,conv_id=uuid))
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/search.py", line 76, in web
apir = APIQuery(query,resp=resp)
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/search.py", line 151, in APIQuery
response = chatbot.ask(prompt,convo_id='api')
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/optimizeOpenAI.py", line 206, in ask
full_response: str = "".join(response)
File "/Users/zhangmeng/Downloads/EX-chatGPT-main/chatGPTEx/optimizeOpenAI.py", line 154, in ask_stream
response = self.session.post(
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/requests/sessions.py", line 590, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/Users/zhangmeng/opt/anaconda3/lib/python3.9/site-packages/requests/adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='api.openai.com', port=443): Max retries exceeded with url: /v1/chat/completions (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1129)')))

Request for support of Bing Search API

A number of local developers may have greater familiarity with Azure services, and some users may prefer Bing search results due to its tendency to provide direct answers to queries as compared to Google. Hence, would it be possible for you to add support for the Bing Search API?

Thank you for your time and consideration!

Could the author please publish a Docker image? I can't get it to build on my server

Error cause:
Building wheel for tiktoken (pyproject.toml): started #8 14.32 Building wheel for tiktoken (pyproject.toml): finished with status 'error' #8 14.32 error: subprocess-exited-with-error #8 14.32 #8 14.32 × Building wheel for tiktoken (pyproject.toml) did not run successfully. #8 14.32 │ exit code: 1 #8 14.32 ╰─> [37 lines of output] #8 14.32 running bdist_wheel #8 14.32 running build #8 14.32 running build_py #8 14.32 creating build #8 14.32 creating build/lib.linux-aarch64-cpython-311 #8 14.32 creating build/lib.linux-aarch64-cpython-311/tiktoken #8 14.32 copying tiktoken/core.py -> build/lib.linux-aarch64-cpython-311/tiktoken #8 14.32 copying tiktoken/__init__.py -> build/lib.linux-aarch64-cpython-311/tiktoken #8 14.32 copying tiktoken/model.py -> build/lib.linux-aarch64-cpython-311/tiktoken #8 14.32 copying tiktoken/registry.py -> build/lib.linux-aarch64-cpython-311/tiktoken #8 14.32 copying tiktoken/load.py -> build/lib.linux-aarch64-cpython-311/tiktoken #8 14.32 creating build/lib.linux-aarch64-cpython-311/tiktoken_ext #8 14.32 copying tiktoken_ext/openai_public.py -> build/lib.linux-aarch64-cpython-311/tiktoken_ext #8 14.32 running egg_info #8 14.32 writing tiktoken.egg-info/PKG-INFO #8 14.32 writing dependency_links to tiktoken.egg-info/dependency_links.txt #8 14.32 writing requirements to tiktoken.egg-info/requires.txt #8 14.32 writing top-level names to tiktoken.egg-info/top_level.txt #8 14.32 reading manifest file 'tiktoken.egg-info/SOURCES.txt' #8 14.32 reading manifest template 'MANIFEST.in' #8 14.32 warning: no files found matching 'Makefile' #8 14.32 adding license file 'LICENSE' #8 14.32 writing manifest file 'tiktoken.egg-info/SOURCES.txt' #8 14.32 copying tiktoken/py.typed -> build/lib.linux-aarch64-cpython-311/tiktoken #8 14.32 running build_ext #8 14.32 running build_rust #8 14.32 error: can't find Rust compiler #8 14.32 #8 14.32 If you are using an outdated pip version, it is possible a prebuilt wheel is available for this package but pip is not able to install from it. Installing from the wheel would avoid the need for a Rust compiler. #8 14.32 #8 14.32 To update pip, run: #8 14.32 #8 14.32 pip install --upgrade pip #8 14.32 #8 14.32 and then retry package installation. #8 14.32 #8 14.32 If you did intend to build this package from source, try installing a Rust compiler from your system package manager and ensure it is on the PATH during installation. Alternatively, rustup (available at https://rustup.rs) is the recommended way to download and update the Rust compiler toolchain. #8 14.32 [end of output] #8 14.32 #8 14.32 note: This error originates from a subprocess, and is likely not a problem with pip. #8 14.32 ERROR: Failed building wheel for tiktoken #8 14.32 Building wheel for jieba (setup.py): started #8 15.79 Building wheel for jieba (setup.py): finished with status 'done' #8 15.80 Created wheel for jieba: filename=jieba-0.42.1-py3-none-any.whl size=19314458 sha256=ae26fd1eb2cce103bb6db861d567811f9ad88e5243f27ae349150607c9f8183c #8 15.80 Stored in directory: /tmp/pip-ephem-wheel-cache-wrid_ysw/wheels/9e/e1/e6/a79f6806b13624bde86455ca13b7ea62f5ab0cd789ba1e8535 #8 15.80 Successfully built regex jieba #8 15.80 Failed to build tiktoken #8 15.80 ERROR: Could not build wheels for tiktoken, which is required to install pyproject.toml-based projects #8 15.88 #8 15.88 [notice] A new release of pip available: 22.3.1 -> 23.0.1 #8 15.88 [notice] To update, run: pip install --upgrade pip

Token usage is outrageous

Configuration below. The token consumption is absurd: a casual sentence costs thousands of tokens. How can I debug this? Installed with Docker.

Installed on a VPS in Los Angeles
Snipaste_2023-03-24_13-45-51

Snipaste_2023-03-24_13-42-34
Snipaste_2023-03-24_13-42-51
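For reference, one way to see where the tokens go is to count them locally with tiktoken (already pulled in by the project's requirements, as the build error above shows). A small sketch with made-up strings, illustrating that the injected search results and prompt text, not the one-line question, usually dominate the count:

    import tiktoken

    enc = tiktoken.encoding_for_model("gpt-3.5-turbo")

    def count_tokens(text: str) -> int:
        return len(enc.encode(text))

    question = "随便一句话"  # "a casual sentence"
    injected = "system prompt + cleaned Google/WolframAlpha/WikiMedia results ..."  # placeholder
    print("question only:", count_tokens(question))
    print("question + injected context:", count_tokens(question + "\n" + injected))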
