zjunlp / easyinstruct

306 stars, 8 watchers, 28 forks, 18.82 MB

An Easy-to-use Instruction Processing Framework for LLMs.

Home Page: https://zjunlp.github.io/project/EasyInstruct

License: MIT License

Languages: Python 96.47%, Shell 0.64%, Jupyter Notebook 2.90%
Topics: gpt-3, instructions, prompt, api, chain-of-thought, gpt, large-language-models, multimodal, reasoning, in-context-learning

easyinstruct's People

Contributors

bizhen46766, coderxyd, eltociear, flow3rdown, gooodte, guihonghao, njcx-ai, oe-heart, shengyumao, tubg, wakawaka111, xzwyyd, zxlzr

easyinstruct's Issues

Questions: 1) The KG2Instruct method outputs triples from information extraction; how are they applied? 2) Can the code for the Elimination Evolving instruction filtering in the Evol-Instruct method be released?

1. From the current code, the output of the KG2Instruct method is the set of triples produced by information extraction. Could you describe in detail how instruction-tuning training samples are generated from these triples, and release the related code?
2. In the Evol-Instruct method, the Elimination Evolving step filters instructions according to principles including:
1) the evolved instruction provides no information gain compared with the original instruction;
2) the evolved instruction makes it difficult for LLMs to generate a response;
......
How exactly are these filtering principles implemented? Could you describe them in detail, or release the related code?

Thanks! Looking forward to your reply.
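
(Editorial note, not from the issue thread: a minimal sketch of how the two elimination principles quoted above are commonly checked, using an LLM judge for information gain and a response heuristic for difficulty. This is illustrative only and is not EasyInstruct's implementation; the chat parameter is a hypothetical helper that sends one prompt to a chat model and returns its reply.)

def has_information_gain(chat, original: str, evolved: str) -> bool:
    """Ask an LLM judge whether the evolved instruction adds anything new."""
    verdict = chat(
        "Here are two instructions. Answer only 'equal' or 'not equal', where "
        "'equal' means the second adds no new information or difficulty.\n"
        f"1. {original}\n2. {evolved}"
    )
    return "not equal" in verdict.lower()

def is_answerable(response: str) -> bool:
    """Heuristic: a very short apologetic reply signals the instruction is too hard."""
    return not ("sorry" in response.lower() and len(response.split()) < 80)

def keep_instruction(chat, original: str, evolved: str, response: str) -> bool:
    # Keep the evolved instruction only if it passes both elimination checks.
    return has_information_gain(chat, original, evolved) and is_answerable(response)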

ERROR: Failed building wheel for llama-cpp-python

Building wheel for llama-cpp-python (pyproject.toml) ... error
error: subprocess-exited-with-error

Not searching for unused variables given on the command line.
-- The C compiler identification is unknown
CMake Error at CMakeLists.txt:3 (ENABLE_LANGUAGE):
No CMAKE_C_COMPILER could be found.

    Tell CMake where to find the compiler by setting either the environment
    variable "CC" or the CMake cache entry CMAKE_C_COMPILER to the full path to
    the compiler, or to the compiler name if it is in the PATH.


  -- Configuring incomplete, errors occurred!

ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects
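
(Editorial note, not part of the original report: the CMake message above indicates that no C compiler could be found. A minimal sketch of the workaround it suggests, pointing the build at a toolchain via the CC/CXX environment variables before retrying the install; the compiler names are assumptions and should match whatever is installed on your machine.)

import os
import subprocess
import sys

# Point the wheel build at an available C/C++ toolchain (full paths also work),
# then retry the install without reusing a cached failed build.
os.environ["CC"] = "gcc"
os.environ["CXX"] = "g++"
subprocess.check_call(
    [sys.executable, "-m", "pip", "install", "--no-cache-dir", "llama-cpp-python"]
)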

Cannot import the engines folder under the easyinstruct package

Installed with setup.py. Running import easyinstruct cannot find the engines folder under the easyinstruct package and raises:

import easyinstruct
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "d:\tianchi\easyinstruct\easyinstruct\__init__.py", line 1, in <module>
    from .prompts import *
  File "d:\tianchi\easyinstruct\easyinstruct\prompts\__init__.py", line 1, in <module>
    from .base_prompt import BasePrompt
  File "d:\tianchi\easyinstruct\easyinstruct\prompts\base_prompt.py", line 7, in <module>
    from engines import llama_engine
ModuleNotFoundError: No module named 'engines'
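
(Editorial note: the traceback shows base_prompt.py importing the sibling engines package with a bare absolute import, which only resolves when the package directory itself is on sys.path. A package-qualified or relative import is the usual fix; a sketch of both forms:)

# In easyinstruct/prompts/base_prompt.py
# Failing form from the traceback:
#     from engines import llama_engine
# Forms that resolve once easyinstruct is installed as a package:
from easyinstruct.engines import llama_engine
# or, equivalently, relative to the prompts subpackage:
from ..engines import llama_engine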

Migration of OpenAI API

The OpenAI API has been updated to version 1; the openai.ChatCompletion module is no longer supported.

In base_prompt.py,

response = openai.ChatCompletion.create(
                model=engine,
                messages=messages,
                temperature=temperature,
                max_tokens=max_tokens,
                top_p=top_p,
                n=n,
                frequency_penalty=frequency_penalty,
                presence_penalty=presence_penalty,
            )

should be changed to

response = client.chat.completions.create(
                model=engine,
                messages=messages,
                temperature=temperature,
                max_tokens=max_tokens,
                top_p=top_p,
                n=n,
                frequency_penalty=frequency_penalty,
                presence_penalty=presence_penalty,
            )
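
(Not in the original issue: under the v1 SDK the client object referenced above also has to be constructed explicitly. A minimal sketch, assuming the key is read from the environment:)

import os
from openai import OpenAI

# api_key falls back to the OPENAI_API_KEY environment variable if omitted.
client = OpenAI(api_key=os.environ.get("OPENAI_API_KEY"))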

ModuleNotFoundError: No module named 'easyinstruct.utils'

When running example/llm/run.py with easyinstruct==0.0.3 already installed, the following error occurs:
Traceback (most recent call last):
  File "/home/ubuntu/MyFiles/DeepKE-main/example/llm/run.py", line 6, in <module>
    from easyinstruct.prompts import IEPrompt
  File "/home/ubuntu/miniconda3/envs/py39-cu116/lib/python3.9/site-packages/easyinstruct/__init__.py", line 1, in <module>
    from .prompts import *
  File "/home/ubuntu/miniconda3/envs/py39-cu116/lib/python3.9/site-packages/easyinstruct/prompts/__init__.py", line 1, in <module>
    from .base_prompt import BasePrompt
  File "/home/ubuntu/miniconda3/envs/py39-cu116/lib/python3.9/site-packages/easyinstruct/prompts/base_prompt.py", line 5, in <module>
    from easyinstruct.utils.api import API_NAME_DICT
ModuleNotFoundError: No module named 'easyinstruct.utils'
Process finished with exit code 1

How can this be resolved?
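
(Editorial note, not from the thread: a quick way to check whether the installed 0.0.3 distribution actually ships the easyinstruct.utils subpackage; if it does not, installing from a source checkout is the usual workaround.)

import importlib.util

# None means the installed wheel is missing the subpackage; in that case,
# install from the cloned repository instead (for example, pip install -e .).
print(importlib.util.find_spec("easyinstruct.utils"))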

Update Anthropic Client

Anthropic changed their Python SDK, making this code line outdated.

client = anthropic.Client(get_anthropic_key())

Would love to know if this might help - https://github.com/BerriAI/litellm

A simple I/O library that standardizes all LLM API calls to the OpenAI call format.

from litellm import completion

## set ENV variables
# ENV variables can be set in .env file, too. Example in .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["ANTHROPIC_API_KEY"] = "anthropic key"

messages = [{ "content": "Hello, how are you?","role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# anthropic call
response = completion("claude-v-2", messages)
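
(Editorial sketch, not from the thread: the newer Anthropic SDK replaces anthropic.Client with an Anthropic client and a Messages API. The model name and token limit below are placeholders.)

import os
import anthropic

# New-style client; reads the key from the argument or ANTHROPIC_API_KEY.
client = anthropic.Anthropic(api_key=os.environ.get("ANTHROPIC_API_KEY"))

message = client.messages.create(
    model="claude-3-haiku-20240307",  # placeholder model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello, how are you?"}],
)
print(message.content[0].text)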

Has the code related to CIRS been integrated into this project?

I just read the paper "When Do Program-of-Thoughts Work for Reasoning". The results are pretty interesting, and I would like to use the CIRS metric to test some other public code datasets. The paper says the code will be integrated into this project, so I have come here to ask for help. Thank you :)
