
Comments (13)

xwu99 avatar xwu99 commented on September 14, 2024

could you add -f https://developer.intel.com/ipex-whl-stable-cpu to pip install?

from llm-on-ray.

nkanike07 avatar nkanike07 commented on September 14, 2024

could you add -f https://developer.intel.com/ipex-whl-stable-cpu to pip install?

I followed the instructions and executed the command below; I see it's already added:

`pip install .[cpu] -f https://developer.intel.com/ipex-whl-stable-cpu -f https://download.pytorch.org/whl/torch_stable.html`


xwu99 avatar xwu99 commented on September 14, 2024

We haven't verified the package on Windows and don't support it there. Could you try it on Linux?


jiafuzha avatar jiafuzha commented on September 14, 2024

@nkanike07 could you please check your pip version with `pip -V`? Please let me know what you get.

You should get output similar to the following:

`pip 20.2.4 from /usr/lib/python3.9/site-packages/pip (python 3.9)`


nkanike07 avatar nkanike07 commented on September 14, 2024

> @nkanike07 could you please check your pip version with `pip -V`? Please let me know what you get.
>
> You should get output similar to the following:
>
> `pip 20.2.4 from /usr/lib/python3.9/site-packages/pip (python 3.9)`

[screenshot]


nkanike07 avatar nkanike07 commented on September 14, 2024

> @nkanike07 could you please check your pip version with `pip -V`? Please let me know what you get.
>
> You should get output similar to the following:
>
> `pip 20.2.4 from /usr/lib/python3.9/site-packages/pip (python 3.9)`

I downgraded my pip to 20.2.4 and tried to install the dependencies, but hit the same issue again. For your reference, screenshot below:

[screenshot]


jiafuzha avatar jiafuzha commented on September 14, 2024

> @nkanike07 could you please check your pip version with `pip -V`? Please let me know what you get.
> You should get output similar to the following:
> `pip 20.2.4 from /usr/lib/python3.9/site-packages/pip (python 3.9)`
>
> [screenshot]

I upgraded my pip to your version but still cannot reproduce your issue. Could you install conda, create an empty conda environment, and install llm-on-ray there? An existing package in your environment may be conflicting with the llm-on-ray packages.
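The clean-environment suggestion above can be sketched as the following shell steps. The environment name and Python version are illustrative choices, not project requirements; the pip flags are the ones from the install command earlier in the thread:

```shell
# Create an isolated environment so no pre-installed package can conflict.
# "llm-on-ray-env" and python=3.9 are illustrative choices.
conda create -n llm-on-ray-env python=3.9 -y
conda activate llm-on-ray-env

# Re-run the project install inside the clean environment.
pip install .[cpu] -f https://developer.intel.com/ipex-whl-stable-cpu \
    -f https://download.pytorch.org/whl/torch_stable.html
```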


nkanike07 avatar nkanike07 commented on September 14, 2024

> @nkanike07 could you please check your pip version with `pip -V`? Please let me know what you get.
> You should get output similar to the following:
> `pip 20.2.4 from /usr/lib/python3.9/site-packages/pip (python 3.9)`
>
> [screenshot]
>
> I upgraded my pip to your version but still cannot reproduce your issue. Could you install conda, create an empty conda environment, and install llm-on-ray there? An existing package in your environment may be conflicting with the llm-on-ray packages.

Sure, I'll give it a try and post an update here.


jiafuzha avatar jiafuzha commented on September 14, 2024

@nkanike07 I just reproduced the issue on Windows. It is caused by ipex (Intel Extension for PyTorch) not supporting Windows, so please switch to a Linux system. Thanks for reporting the issue.
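Since the ipex CPU wheels target Linux only, a small pre-flight check can fail fast before `pip install` even runs. This helper is a sketch of mine, not part of llm-on-ray:

```python
# Sketch: fail fast on platforms the ipex CPU wheels don't target.
# This helper is hypothetical, not part of llm-on-ray.
import platform

def ipex_platform_supported() -> bool:
    """Return True on Linux, the only OS the ipex CPU wheels support."""
    return platform.system() == "Linux"

if not ipex_platform_supported():
    print("ipex CPU wheels are Linux-only; use a Linux host or a container.")
```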


nkanike07 avatar nkanike07 commented on September 14, 2024

I downloaded an Ubuntu Docker image and could successfully install all the dependencies in it.

I'm currently facing one issue while running inference.

When I ran the inference command below:

`python inference/serve.py --config_file inference/models/gpt2.yaml --simple`

I got the missing-module error below:

[screenshot]

`ModuleNotFoundError: No module named 'inference.api_openai_backend'`

Could someone help me here?
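One quick way to narrow down an error like this is to check whether the dotted module path resolves at all from the current working directory and environment. The helper below is an illustration, not project code; `importlib.util.find_spec` is the standard-library way to probe importability:

```python
# Sketch: check whether a dotted module path is importable. find_spec
# imports parent packages but not the target module itself.
import importlib.util

def module_available(name: str) -> bool:
    """Return True if `name` resolves on the current sys.path."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package in the dotted path is missing,
        # e.g. the top-level package is not on sys.path at all.
        return False

print(module_available("json"))                     # stdlib module: True
print(module_available("no_such_pkg.no_such_mod"))  # missing: False
```

Running such a check from the repository root versus another directory can give different answers for a path like `inference.api_openai_backend`, which is one common way this kind of error creeps in.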


jiafuzha avatar jiafuzha commented on September 14, 2024

@nkanike07 We are working on it. Will update you soon.


jiafuzha avatar jiafuzha commented on September 14, 2024

@nkanike07 Unfortunately, we cannot reproduce your issue. Let's have a talk in Teams.


jiafuzha avatar jiafuzha commented on September 14, 2024

@nkanike07 I just fixed the issue and merged it into the main branch. Please check out the latest code and try again.
The root cause is that pip and setuptools behave differently on bare metal and in a container. I adjusted the way setuptools finds project files; as verified, it now works both on bare metal and in a container.
Thanks for the report.
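The setuptools-discovery root cause described above can be illustrated with explicit `find_packages` include patterns, which pin discovery down regardless of where pip builds the project. The directory names below are illustrative, mirroring the module from the traceback; this is not the project's actual configuration:

```python
# Sketch: deterministic setuptools package discovery via explicit
# include patterns. Directory names here are illustrative only.
import os
import tempfile
from setuptools import find_packages

with tempfile.TemporaryDirectory() as root:
    # Minimal project tree: inference/api_openai_backend/
    pkg = os.path.join(root, "inference", "api_openai_backend")
    os.makedirs(pkg)
    open(os.path.join(root, "inference", "__init__.py"), "w").close()
    open(os.path.join(pkg, "__init__.py"), "w").close()

    # Explicit patterns pin what gets packaged, instead of depending on
    # the working directory or environment (bare metal vs. container).
    found = find_packages(where=root, include=["inference", "inference.*"])

print(sorted(found))  # ['inference', 'inference.api_openai_backend']
```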

