Comments (4)
Hi @volkerha this is not currently possible in MII. In general, any extra kwargs passed to `query()` will be passed on to the `transformers.pipeline` object. For example:

```python
pipeline_kwargs = {"max_new_tokens": 50, "batch_size": 2}
result = generator.query({'query': ["DeepSpeed is the", "Seattle is"]}, **pipeline_kwargs)
```
The problem is that our current deployment types utilize a gRPC server, and those `pipeline_kwargs` values must be serialized. Currently we only support values that are `int`, `float`, `str`, and `bool`. I see a few ways forward:
1. We introduce a deployment type that doesn't rely on gRPC (this is something we have on our TODO list currently).
2. We extend serialization capabilities (this would be great because it would fix this issue even for deployments that utilize gRPC).
3. We add a custom API that generates the stopping criteria based on user input and passes that to the pipeline (this would probably be the most fragile and difficult-to-maintain option).
In your current usage of MII, do you require the gRPC server capabilities, or would a non-gRPC deployment work for you? I believe (1) will be the most likely to be implemented in the near future.
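To illustrate the serialization constraint, here is a minimal sketch (not MII's actual code; the function name and the allowed-type set are assumptions based on the description above) of splitting user kwargs into the scalar values that would survive the gRPC boundary and those that would not:

```python
# Sketch: keep only kwargs whose values are serializable scalars,
# matching the int/float/str/bool restriction described above.
SERIALIZABLE_TYPES = (int, float, str, bool)

def split_serializable_kwargs(kwargs):
    """Return (serializable, rejected) kwarg dicts."""
    serializable, rejected = {}, {}
    for key, value in kwargs.items():
        # bool is a subclass of int, but listing both keeps the intent explicit.
        if isinstance(value, SERIALIZABLE_TYPES):
            serializable[key] = value
        else:
            rejected[key] = value
    return serializable, rejected

ok, bad = split_serializable_kwargs(
    {"max_new_tokens": 50, "do_sample": True, "stopping_criteria": object()}
)
```

A `stopping_criteria` object lands in the rejected dict, which is exactly why it cannot be passed through `query()` today.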
from deepspeed-mii.
Looks like there is a `stop_sequence` (though limited to one token): huggingface/transformers#18444
I'm thinking of opening a PR against Hugging Face to add an argument to the generation pipeline that specifies stop tokens (so we don't have to pass objects around).
from deepspeed-mii.
As you mentioned, 🤗 transformers simply sets the `eos_token_id` to the first token of the `stop_sequence` here. We can directly pass `eos_token_id` to `generate(...)`, so there's not much benefit.
In my case, I would like to specify a sequence of several tokens as the stopping criterion, e.g. generation should stop only when the model produces something like "User:". Also, it would be good to keep the original `eos_token`.
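A multi-token stop sequence boils down to checking whether the generated ids end with the stop ids. A minimal sketch of that check in plain Python (not the transformers `StoppingCriteria` API; the token ids for "User:" are hypothetical):

```python
def ends_with_stop_sequence(generated_ids, stop_ids):
    """True once the tail of generated_ids matches stop_ids exactly."""
    if len(stop_ids) == 0 or len(generated_ids) < len(stop_ids):
        return False
    return generated_ids[-len(stop_ids):] == stop_ids

# Suppose "User:" tokenizes to [512, 25] (hypothetical ids).
stop_ids = [512, 25]
history = [101, 7, 512]   # ends with "User" alone: keep generating
assert not ends_with_stop_sequence(history, stop_ids)
history.append(25)        # now ends with "User:" -> stop
assert ends_with_stop_sequence(history, stop_ids)
```

Note this only fires on the full sequence, so the genuine `eos_token` can stay untouched as a separate stop condition.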
from deepspeed-mii.
This PR is merged: huggingface/transformers#20727. Once released, we should be able to do `model.generate(..., eos_token_id=[1, 2])`. It's not using `stopping_criteria`, since it looks like the JAX and TF code doesn't use them, and beam search doesn't work with `stopping_criteria`.
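With the list form, stopping reduces to a membership test: generation halts as soon as the last generated token is any of the configured eos ids. A tiny sketch of that semantics (an assumption about the behavior, not the transformers implementation):

```python
def should_stop(last_token_id, eos_token_id):
    """Accept a single id or a list of ids, as in eos_token_id=[1, 2]."""
    if isinstance(eos_token_id, int):
        eos_token_id = [eos_token_id]
    return last_token_id in eos_token_id

assert should_stop(2, [1, 2])       # either id in the list ends generation
assert not should_stop(3, [1, 2])
assert should_stop(1, 1)            # single-id form still works
```

This covers the "several single-token stop words" case, but a multi-token stop sequence like "User:" still needs a tail-match over the whole generated sequence rather than a check on one token.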
from deepspeed-mii.