
This project forked from bentoml/bentoml


Build Production-Grade AI Applications

Home Page: https://bentoml.com

License: Apache License 2.0


bentoml's Introduction


BentoML: The Unified AI Application Framework


BentoML is a framework for building reliable, scalable, and cost-efficient AI applications. It comes with everything you need for model serving, application packaging, and production deployment.

👉 Join our Slack community!

Highlights

๐Ÿฑ Bento is the container for AI apps

  • Open standard and SDK for AI apps: pack your code, inference pipelines, model files, dependencies, and runtime configurations into a Bento.
  • Auto-generate API servers, supporting REST API, gRPC, and long-running inference jobs.
  • Auto-generate Docker container images.

๐Ÿ„ Freedom to build with any AI models

๐Ÿญ Simplify modern AI application architecture

๐Ÿš€ Deploy Anywhere

  • One-click deployment to ☁️ BentoCloud, the serverless platform made for hosting and operating AI apps.
  • Scalable BentoML deployment with 🦄️ Yatai on Kubernetes.
  • Deploy auto-generated container images anywhere Docker runs.

Documentation

🛠️ What you can build with BentoML

Getting Started

Save or import models in BentoML local model store:

import bentoml
import transformers

pipe = transformers.pipeline("text-classification")

bentoml.transformers.save_model(
  "text-classification-pipe",
  pipe,
  signatures={
    "__call__": {"batchable": True}  # Enable dynamic batching for model
  }
)
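The batchable signature above allows BentoML to merge concurrent requests into a single model call at serving time. The toy sketch below illustrates the idea only; it is plain asyncio with a hypothetical ToyBatcher and fake_model, not BentoML's actual batching implementation:

```python
import asyncio

# Toy dynamic batcher: concurrent requests are queued briefly, then served
# by one batched model call. Illustrative only, not BentoML internals.
class ToyBatcher:
    def __init__(self, model_fn, max_wait=0.01):
        self.model_fn = model_fn    # takes a list of inputs, returns a list
        self.max_wait = max_wait    # window for collecting a batch
        self.queue = []
        self.flush_task = None

    async def submit(self, item):
        fut = asyncio.get_running_loop().create_future()
        self.queue.append((item, fut))
        if self.flush_task is None:
            self.flush_task = asyncio.create_task(self._flush())
        return await fut

    async def _flush(self):
        await asyncio.sleep(self.max_wait)   # let requests accumulate
        batch, self.queue, self.flush_task = self.queue, [], None
        results = self.model_fn([item for item, _ in batch])  # one call
        for (_, fut), res in zip(batch, results):
            fut.set_result(res)

async def main():
    calls = []

    def fake_model(inputs):          # records the size of each invocation
        calls.append(len(inputs))
        return [f"label-for-{x}" for x in inputs]

    batcher = ToyBatcher(fake_model)
    outs = await asyncio.gather(*(batcher.submit(t) for t in ["a", "b", "c"]))
    return calls, outs

calls, outs = asyncio.run(main())
```

Three concurrent submissions end up served by a single model invocation over a batch of three, which is the kind of grouping the batchable flag enables for real model signatures.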

View all models saved locally:

$ bentoml models list

Tag                                     Module                Size        Creation Time
text-classification-pipe:kn6mr3aubcuf…  bentoml.transformers  256.35 MiB  2023-05-17 14:36:25

Define how your model runs in a service.py file:

import bentoml

model_runner = bentoml.models.get("text-classification-pipe").to_runner()

svc = bentoml.Service("text-classification-service", runners=[model_runner])

@svc.api(input=bentoml.io.Text(), output=bentoml.io.JSON())
async def classify(text: str) -> dict:
    results = await model_runner.async_run([text])
    return results[0]
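Note the batch-in, batch-out calling convention in the endpoint above: a runner with a batchable signature takes a list of inputs and returns a list of results, which is why the single input is wrapped as [text] and the answer read back as results[0]. A plain-Python sketch of that convention (the stand-in function and its fixed output are hypothetical):

```python
def batched_classify(texts):
    # Stand-in for a batchable model signature: list in, list out.
    # The fixed label and score are placeholder values.
    return [{"label": "POSITIVE", "score": 0.99} for _ in texts]

# A single request travels as a one-element batch...
results = batched_classify(["BentoML is awesome"])
# ...and its answer is the first (only) element of the returned batch.
single = results[0]
```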

Now, run the API service locally:

bentoml serve service.py:svc

Send a prediction request:

$ curl -X POST -H "Content-Type: text/plain" --data "BentoML is awesome" http://localhost:3000/classify

{"label":"POSITIVE","score":0.9129443168640137}
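The response is the transformers pipeline's usual output serialized as JSON: a predicted label plus a confidence score between 0 and 1. Parsing the body shown above:

```python
import json

# Response body from the curl call above.
body = '{"label":"POSITIVE","score":0.9129443168640137}'
result = json.loads(body)

label = result["label"]    # "POSITIVE" or "NEGATIVE" for this pipeline
score = result["score"]    # model confidence, a float in [0, 1]
```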

Define how a Bento can be built for deployment in bentofile.yaml:

service: 'service.py:svc'
name: text-classification-svc
include:
  - 'service.py'
python:
  packages:
  - torch>=2.0
  - transformers

Build a Bento and generate a Docker image:

$ bentoml build
...
Successfully built Bento(tag="text-classification-svc:mc322vaubkuapuqj").
$ bentoml containerize text-classification-svc
Building OCI-compliant image for text-classification-svc:mc322vaubkuapuqj with docker
...
Successfully built Bento container for "text-classification-svc" with tag(s) "text-classification-svc:mc322vaubkuapuqj"
$ docker run -p 3000:3000 text-classification-svc:mc322vaubkuapuqj

For a more detailed user guide, check out the BentoML Tutorial.


Community

BentoML supports billions of model runs per day and is used by thousands of organizations around the globe.

Join our Community Slack 💬, where thousands of AI application developers contribute to the project and help each other.

To report a bug or suggest a feature request, use GitHub Issues.

Contributing

There are many ways to contribute to the project:

  • Report bugs and "Thumbs up" on issues that are relevant to you.
  • Investigate issues and review other developers' pull requests.
  • Contribute code or documentation to the project by submitting a GitHub pull request.
  • Check out the Contributing Guide and Development Guide to learn more.
  • Share your feedback and discuss roadmap plans in the #bentoml-contributors channel here.

Thanks to all of our amazing contributors!


Usage Reporting

BentoML collects usage data that helps our team improve the product. Only BentoML's internal API calls are reported. We strip out as much potentially sensitive information as possible, and we never collect user code, model data, model names, or stack traces. Here's the code for usage tracking. You can opt out of usage tracking with the --do-not-track CLI option:

bentoml [command] --do-not-track

Or by setting environment variable BENTOML_DO_NOT_TRACK=True:

export BENTOML_DO_NOT_TRACK=True
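The same opt-out can also be applied from Python, provided the variable is set before bentoml is imported (a sketch using the variable documented above):

```python
import os

# Disable usage reporting for this process; must run before importing bentoml.
os.environ["BENTOML_DO_NOT_TRACK"] = "True"
```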

License

Apache License 2.0


Citation

If you use BentoML in your research, please cite it using the following [citation](./CITATION.cff):

@software{Yang_BentoML_The_framework,
author = {Yang, Chaoyu and Sheng, Sean and Pham, Aaron and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
license = {Apache-2.0},
title = {{BentoML: The framework for building reliable, scalable and cost-efficient AI application}},
url = {https://github.com/bentoml/bentoml}
}

bentoml's People

Contributors

1e0ng, aarnphm, akainth015, bojiang, dependabot[bot], dtemir, frostming, haivilo, jackyzha0, jianshen92, jiewpeng, jjmachan, judahrand, kimsoungryoul, korusuke, larme, lintingzhen, liusy182, mayurnewase, mqk, parano, qu8n, sauyon, sherlock113, ssheng, tashakim, telescopic, timliubentoml, yetone, yubozhao

bentoml's Issues

bug: OSError: transformer mask-generation does not appear to have a file named preprocessor_config.json

Describe the bug

$ bentoml serve service:svc
2023-09-04T13:11:48+0900 [INFO] [cli] Environ for worker 0: set CUDA_VISIBLE_DEVICES to 0
2023-09-04 13:11:48,168 - bentoml._internal.runner.strategy - INFO - Environ for worker 0: set CUDA_VISIBLE_DEVICES to 0
2023-09-04T13:11:48+0900 [INFO] [cli] Environ for worker 1: set CUDA_VISIBLE_DEVICES to 1
2023-09-04 13:11:48,187 - bentoml._internal.runner.strategy - INFO - Environ for worker 1: set CUDA_VISIBLE_DEVICES to 1
2023-09-04T13:11:48+0900 [INFO] [cli] Environ for worker 2: set CUDA_VISIBLE_DEVICES to 2
2023-09-04 13:11:48,187 - bentoml._internal.runner.strategy - INFO - Environ for worker 2: set CUDA_VISIBLE_DEVICES to 2
2023-09-04T13:11:48+0900 [INFO] [cli] Environ for worker 3: set CUDA_VISIBLE_DEVICES to 3
2023-09-04 13:11:48,187 - bentoml._internal.runner.strategy - INFO - Environ for worker 3: set CUDA_VISIBLE_DEVICES to 3
2023-09-04T13:11:48+0900 [INFO] [cli] Prometheus metrics for HTTP BentoServer from "service:svc" can be accessed at http://localhost:3000/metrics.
2023-09-04 13:11:48,193 - bentoml.serve - INFO - Prometheus metrics for HTTP BentoServer from "service:svc" can be accessed at http://localhost:3000/metrics.
2023-09-04T13:11:50+0900 [INFO] [cli] Starting production HTTP BentoServer from "service:svc" listening on http://0.0.0.0:3000 (Press CTRL+C to quit)
2023-09-04 13:11:50,178 - bentoml.serve - INFO - Starting production HTTP BentoServer from "service:svc" listening on http://0.0.0.0:3000 (Press CTRL+C to quit)
2023-09-04T13:12:15+0900 [ERROR] [runner:mask-generation:3] An exception occurred while instantiating runner 'mask-generation', see details below:
2023-09-04 13:12:15,706 - bentoml._internal.runner.runner - ERROR - An exception occurred while instantiating runner 'mask-generation', see details below:
2023-09-04T13:12:15+0900 [ERROR] [runner:mask-generation:3] Traceback (most recent call last):
  File "/home/{user}/.local/lib/python3.10/site-packages/bentoml/_internal/runner/runner.py", line 307, in init_local
    self._set_handle(LocalRunnerRef)
  File "/home/{user}/.local/lib/python3.10/site-packages/bentoml/_internal/runner/runner.py", line 150, in _set_handle
    runner_handle = handle_class(self, *args, **kwargs)
  File "/home/{user}/.local/lib/python3.10/site-packages/bentoml/_internal/runner/runner_handle/local.py", line 27, in __init__
    self._runnable = runner.runnable_class(**runner.runnable_init_params)  # type: ignore
  File "/home/{user}/.local/lib/python3.10/site-packages/bentoml/_internal/frameworks/transformers.py", line 881, in __init__
    self.model = load_model(bento_model, **kwargs)
  File "/home/{user}/.local/lib/python3.10/site-packages/bentoml/_internal/frameworks/transformers.py", line 515, in load_model
    return transformers.pipeline(
  File "/home/{user}/.local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 929, in pipeline
    image_processor = AutoImageProcessor.from_pretrained(
  File "/home/{user}/.local/lib/python3.10/site-packages/transformers/models/auto/image_processing_auto.py", line 344, in from_pretrained
    config_dict, _ = ImageProcessingMixin.get_image_processor_dict(pretrained_model_name_or_path, **kwargs)
  File "/home/{user}/.local/lib/python3.10/site-packages/transformers/image_processing_utils.py", line 329, in get_image_processor_dict
    resolved_image_processor_file = cached_file(
  File "/home/{user}/.local/lib/python3.10/site-packages/transformers/utils/hub.py", line 399, in cached_file
    raise EnvironmentError(
OSError: /home/{user}/bentoml/models/mask-generation/a6tg6lsk3ggi76vn does not appear to have a file named preprocessor_config.json. Checkout 'https://huggingface.co//home/{user}/bentoml/models/mask-generation/a6tg6lsk3ggi76vn/None' for available files.

To reproduce

No response

Expected behavior

No response

Environment

Environment variable

BENTOML_DEBUG=''
BENTOML_QUIET=''
BENTOML_BUNDLE_LOCAL_BUILD=''
BENTOML_DO_NOT_TRACK=''
BENTOML_CONFIG=''
BENTOML_CONFIG_OPTIONS=''
BENTOML_PORT=''
BENTOML_HOST=''
BENTOML_API_WORKERS=''

System information

bentoml: 1.1.5
python: 3.10.11
platform: Linux-5.15.0-79-generic-x86_64-with-glibc2.31
uid_gid: 1111:1111

pip_packages
aiohttp==3.8.5
aiosignal==1.3.1
anyio==3.7.1
appdirs==1.4.4
argon2-cffi==21.3.0
argon2-cffi-bindings==21.2.0
arrow==1.2.3
asgiref==3.7.2
asttokens @ file:///opt/conda/conda-bld/asttokens_1646925590279/work
astunparse==1.6.3
async-lru==2.0.3
async-timeout==4.0.3
attrs==23.1.0
Babel==2.12.1
backcall @ file:///home/ktietz/src/ci/backcall_1611930011877/work
beautifulsoup4 @ file:///croot/beautifulsoup4-split_1681493039619/work
bentoml==1.1.5
bleach==6.0.0
boltons @ file:///croot/boltons_1677628692245/work
brotlipy==0.7.0
build==1.0.0
cattrs==23.1.2
certifi @ file:///croot/certifi_1683875369620/work/certifi
cffi @ file:///croot/cffi_1670423208954/work
chardet @ file:///home/builder/ci_310/chardet_1640804867535/work
charset-normalizer @ file:///tmp/build/80754af9/charset-normalizer_1630003229654/work
circus==0.18.0
click==8.1.7
click-option-group==0.5.6
cloudpickle==2.2.1
comm==0.1.3
conda==23.3.1
conda-build==3.24.0
conda-content-trust @ file:///tmp/abs_5952f1c8-355c-4855-ad2e-538535021ba5h26t22e5/croots/recipe/conda-content-trust_1658126371814/work
conda-package-handling @ file:///croot/conda-package-handling_1672865015732/work
conda_package_streaming @ file:///croot/conda-package-streaming_1670508151586/work
contextlib2==21.6.0
cryptography @ file:///croot/cryptography_1677533068310/work
debugpy==1.6.7
decorator @ file:///opt/conda/conda-bld/decorator_1643638310831/work
deepmerge==1.1.0
defusedxml==0.7.1
Deprecated==1.2.14
dnspython==2.3.0
exceptiongroup==1.1.1
executing @ file:///opt/conda/conda-bld/executing_1646925071911/work
expecttest==0.1.4
fastjsonschema==2.18.0
filelock @ file:///croot/filelock_1672387128942/work
fqdn==1.5.1
frozenlist==1.4.0
fs==2.4.16
fsspec==2023.9.0
glob2 @ file:///home/linux1/recipes/ci/glob2_1610991677669/work
gmpy2 @ file:///tmp/build/80754af9/gmpy2_1645455533097/work
h11==0.14.0
huggingface-hub==0.16.4
hypothesis==6.75.2
idna @ file:///croot/idna_1666125576474/work
importlib-metadata==6.0.1
inflection==0.5.1
ipykernel==6.25.0
ipython @ file:///croot/ipython_1680701871216/work
isoduration==20.11.0
jedi @ file:///tmp/build/80754af9/jedi_1644315229345/work
Jinja2 @ file:///croot/jinja2_1666908132255/work
json5==0.9.14
jsonpatch @ file:///tmp/build/80754af9/jsonpatch_1615747632069/work
jsonpointer==2.1
jsonschema==4.18.4
jsonschema-specifications==2023.7.1
jupyter-events==0.6.3
jupyter-lsp==2.2.0
jupyter_client==8.3.0
jupyter_core==5.3.1
jupyter_server==2.7.0
jupyter_server_terminals==0.4.4
jupyterlab==4.0.3
jupyterlab-pygments==0.2.2
jupyterlab_server==2.24.0
libarchive-c @ file:///tmp/build/80754af9/python-libarchive-c_1617780486945/work
markdown-it-py==3.0.0
MarkupSafe @ file:///opt/conda/conda-bld/markupsafe_1654597864307/work
matplotlib-inline @ file:///opt/conda/conda-bld/matplotlib-inline_1662014470464/work
mdurl==0.1.2
mistune==3.0.1
mkl-fft==1.3.6
mkl-random @ file:///work/mkl/mkl_random_1682950433854/work
mkl-service==2.4.0
mpmath==1.3.0
multidict==6.0.4
nbclient==0.8.0
nbconvert==7.7.3
nbformat==5.9.1
nest-asyncio==1.5.6
networkx==3.1
notebook_shim==0.2.3
numpy @ file:///work/mkl/numpy_and_numpy_base_1682953417311/work
opentelemetry-api==1.18.0
opentelemetry-instrumentation==0.39b0
opentelemetry-instrumentation-aiohttp-client==0.39b0
opentelemetry-instrumentation-asgi==0.39b0
opentelemetry-sdk==1.18.0
opentelemetry-semantic-conventions==0.39b0
opentelemetry-util-http==0.39b0
overrides==7.3.1
packaging @ file:///croot/packaging_1678965309396/work
pandocfilters==1.5.0
parso @ file:///opt/conda/conda-bld/parso_1641458642106/work
pathspec==0.11.2
pexpect @ file:///tmp/build/80754af9/pexpect_1605563209008/work
pickleshare @ file:///tmp/build/80754af9/pickleshare_1606932040724/work
Pillow==9.4.0
pip-requirements-parser==32.0.1
pip-tools==7.3.0
pkginfo @ file:///croot/pkginfo_1679431160147/work
platformdirs==3.9.1
pluggy @ file:///tmp/build/80754af9/pluggy_1648024709248/work
prometheus-client==0.17.1
prompt-toolkit @ file:///croot/prompt-toolkit_1672387306916/work
psutil @ file:///opt/conda/conda-bld/psutil_1656431268089/work
ptyprocess @ file:///tmp/build/80754af9/ptyprocess_1609355006118/work/dist/ptyprocess-0.7.0-py2.py3-none-any.whl
pure-eval @ file:///opt/conda/conda-bld/pure_eval_1646925070566/work
pycosat @ file:///croot/pycosat_1666805502580/work
pycparser @ file:///tmp/build/80754af9/pycparser_1636541352034/work
Pygments @ file:///croot/pygments_1683671804183/work
pynvml==11.5.0
pyOpenSSL @ file:///croot/pyopenssl_1677607685877/work
pyparsing==3.1.1
pyproject_hooks==1.0.0
PySocks @ file:///home/builder/ci_310/pysocks_1640793678128/work
python-dateutil==2.8.2
python-etcd==0.4.5
python-json-logger==2.0.7
python-multipart==0.0.6
pytz @ file:///croot/pytz_1671697431263/work
PyYAML @ file:///croot/pyyaml_1670514731622/work
pyzmq==25.1.0
referencing==0.30.0
regex==2023.8.8
requests @ file:///croot/requests_1682607517574/work
rfc3339-validator==0.1.4
rfc3986-validator==0.1.1
rich==13.5.2
rpds-py==0.9.2
ruamel.yaml @ file:///croot/ruamel.yaml_1666304550667/work
ruamel.yaml.clib @ file:///croot/ruamel.yaml.clib_1666302247304/work
safetensors==0.3.3
schema==0.7.5
Send2Trash==1.8.2
simple-di==0.1.5
six @ file:///tmp/build/80754af9/six_1644875935023/work
sniffio==1.3.0
sortedcontainers==2.4.0
soupsieve @ file:///croot/soupsieve_1680518478486/work
stack-data @ file:///opt/conda/conda-bld/stack_data_1646927590127/work
starlette==0.31.1
sympy==1.12
terminado==0.17.1
tinycss2==1.2.1
tokenizers==0.13.3
tomli @ file:///opt/conda/conda-bld/tomli_1657175507142/work
toolz @ file:///croot/toolz_1667464077321/work
torch==2.0.1
torchaudio==2.0.2
torchdata @ file:///__w/_temp/conda_build_env/conda-bld/torchdata_1682362130135/work
torchelastic==0.2.2
torchtext==0.15.2
torchvision==0.15.2
tornado==6.3.2
tqdm @ file:///croot/tqdm_1679561862951/work
traitlets @ file:///croot/traitlets_1671143879854/work
transformers==4.32.1
triton==2.0.0
types-dataclasses==0.6.6
typing_extensions @ file:///croot/typing_extensions_1681939499988/work
uri-template==1.3.0
urllib3 @ file:///croot/urllib3_1680254681959/work
uvicorn==0.23.2
watchfiles==0.20.0
wcwidth @ file:///Users/ktietz/demo/mc3/conda-bld/wcwidth_1629357192024/work
webcolors==1.13
webencodings==0.5.1
websocket-client==1.6.1
wrapt==1.15.0
yarl==1.9.2
zipp==3.16.2
zstandard @ file:///croot/zstandard_1677013143055/work
