Comments (5)
@mrwyattii could you help look into this? Many thanks!
from deepspeed-mii.
Hi @zzz0906 I'm unable to reproduce this error on my side. I'm using [email protected] and [email protected]
Could you help me by running the following python script? This will help me understand whether it's a problem with DeepSpeed-MII or with the benchmark code.
import mii
client = mii.serve("NousResearch/Llama-2-13b-hf")
r = client("test input", max_new_tokens=128)
print(r)
Sure, when I run the second line
> r = client("test input", max_new_tokens=128)
I get the following error:
>>> r = client("test input", max_new_tokens=128)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/lus/xxxx/projects/xxx/xxx/DeepSpeed-MII-internal/mii/backend/client.py", line 42, in __call__
return self.generate(*args, **kwargs)
File "/lus/xxxx/projects/xxx/xxx/DeepSpeed-MII-internal/mii/backend/client.py", line 74, in generate
return self.asyncio_loop.run_until_complete(
File "/soft/conda/2022-09-08/mconda3/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
return future.result()
File "/lus/xxxx/projects/xxx/xxx/DeepSpeed-MII-internal/mii/backend/client.py", line 47, in _request_async_response
proto_response = await getattr(self.stub, task_methods.method)(proto_request)
File "/xxxx/xxx/xxx/xxx/xxx/lib/python3.8/site-packages/grpc/aio/_call.py", line 318, in __await__
raise _create_rpc_error(
grpc.aio._call.AioRpcError: <AioRpcError of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "failed to connect to all addresses; last error: UNKNOWN: ipv4:140.221.69.42:3128: HTTP proxy returned response code 403"
debug_error_string = "UNKNOWN:Error received from peer {created_time:"2024-02-28T20:12:18.414690102+00:00", grpc_status:14, grpc_message:"failed to connect to all addresses; last error: UNKNOWN: ipv4:140.221.69.42:3128: HTTP proxy returned response code 403"}"
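The StatusCode.UNAVAILABLE above shows gRPC routing the local connection through the cluster's HTTP proxy (140.221.69.42:3128), which rejects it with a 403. gRPC honors the standard http_proxy/https_proxy environment variables, so one hedged workaround is to exclude local addresses via no_proxy before the client's channel is created (the variable names are standard; whether this resolves this particular cluster setup is an assumption):

```python
import os

# gRPC respects http_proxy/https_proxy, so a cluster-wide proxy can intercept
# the client's connection to the local MII server. Adding local addresses to
# no_proxy (set BEFORE the gRPC channel is created) sidesteps the proxy.
for var in ("no_proxy", "NO_PROXY"):
    existing = os.environ.get(var, "")
    os.environ[var] = ",".join(filter(None, [existing, "localhost", "127.0.0.1"]))

print(os.environ["no_proxy"])
```

This must run before mii.serve() / the client constructor, since gRPC reads the proxy settings when the channel is first built.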
Thanks @zzz0906. Because you're not able to run even a simple example, can you share some information about your Python environment? Please run the following and share the output:
pip list | grep "asyncio\|grpc\|deepspeed\|torch"
Can you also ensure that no orphaned MII processes are running in the background with:
ps aux | grep "mii"
When you run this, it should only show a single line like this:
michael+ 2763709 0.0 0.0 6440 2588 pts/3 S+ 15:49 0:00 grep --color=auto --exclude-dir=.bzr --exclude-dir=CVS --exclude-dir=.git --exclude-dir=.hg --exclude-dir=.svn --exclude-dir=.idea --exclude-dir=.tox mii
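The cleanup step above can be sketched in shell. The `[m]ii` bracket trick keeps grep from matching its own process line, and the `mii.launch` pattern used to terminate leftover workers is an assumption about how MII names its server processes:

```shell
# List any MII-related processes, excluding this grep itself.
ps aux | grep "[m]ii" || echo "no MII processes found"

# Terminate leftover server workers, if any (pattern is an assumption).
pkill -f "mii.launch" || true
```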
Sure,
- pip list | grep "asyncio\|grpc\|deepspeed\|torch"
asyncio 3.4.3
deepspeed 0.13.2+83eb2201 /lus/grand/projects/DeepSpeed-internal
deepspeed-kernels 0.0.1.dev1698255861
deepspeed-mii 0.2.1+4cc5d90 /lus/grand/projects/DeepSpeed-MII-internal
functorch 0.3.0a0+091d999
gpytorch 1.9.0
grpcio 1.62.0
grpcio-tools 1.62.0
nest-asyncio 1.5.5
pytorch-lightning 1.7.5
qtorch 0.3.0
torch 1.12.0a0+git664058f
torch-geometric 2.1.0.post1
torch-scatter 2.0.9
torch-sparse 0.6.15
torch-tb-profiler 0.4.0
torchinfo 1.7.0
torchmetrics 0.9.3
torchvision 0.13.0a0+da3794e
torchviz 0.0.2
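Worth noting: the listing above pairs deepspeed-mii 0.2.1 with torch 1.12 and Python 3.8, while the working setup reported earlier in the thread used [email protected] and [email protected]. A hedged, stdlib-only sketch for dumping the relevant versions in one go (package names are the distribution names from the pip listing; importlib.metadata requires Python 3.8+):

```python
# Report installed versions of the packages that matter here, using only
# the standard library (no pip/grep needed).
from importlib.metadata import version, PackageNotFoundError

for pkg in ("torch", "deepspeed", "deepspeed-mii", "grpcio", "grpcio-tools"):
    try:
        print(f"{pkg}=={version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```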
- Yeah, when I ran it I found that there were orphaned MII processes running. I killed them and reran:
ps aux | grep "mii"
mic+ 19204 0.0 0.0 8216 804 pts/0 S+ 01:41 0:00 grep --color=auto mii
import mii
client = mii.serve("NousResearch/Llama-2-13b-hf")
r = client("test input", max_new_tokens=128)
print(r)
Now I get the following output:
>>> r = client("test input", max_new_tokens=128)
>>> print(r)
[and output files at the shell prompt.
"ERROR", // system error occurred.
"FATAL", // critical system error occurred.
code, and fileName + errDesc as the output.
file without modification. If successful, the process should write the updated output file.
file without modification. If successful, the process should write the updated output file to outputFileName.
// From now on, I am getting the input file and output file names in runtime as below.]