
Alpa

Documentation | Slack

Alpa is a system for training and serving large-scale neural networks.

Scaling neural networks to hundreds of billions of parameters has enabled dramatic breakthroughs such as GPT-3, but training and serving these large-scale neural networks require complicated distributed system techniques. Alpa aims to automate large-scale distributed training and serving with just a few lines of code.

The key features of Alpa include:

💻 Automatic Parallelization. Alpa automatically parallelizes users' single-device code on distributed clusters with data, operator, and pipeline parallelism (a sketch of selecting these strategies follows this list).

🚀 Excellent Performance. Alpa achieves linear scaling when training models with billions of parameters on distributed clusters.

✨ Tight Integration with Machine Learning Ecosystem. Alpa is backed by open-source, high-performance, and production-ready libraries such as Jax, XLA, and Ray.
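
As a rough illustration of how a parallelization strategy is chosen, here is a minimal sketch. The ShardParallel and PipeshardParallel method classes and the num_micro_batches argument reflect Alpa's documented API, but their exact names and signatures may differ between versions, so treat this as an assumption rather than a verbatim recipe.

import alpa

# Assumption: alpa.parallelize accepts a `method` argument that selects the
# parallelization strategy; class names and arguments may vary across versions.

# Intra-operator (data + operator) parallelism on a single device mesh.
shard = alpa.ShardParallel()

# Pipeline parallelism combined with intra-operator parallelism across
# multiple meshes, splitting each batch into micro-batches.
pipeshard = alpa.PipeshardParallel(num_micro_batches=16)

@alpa.parallelize(method=pipeshard)
def train_step(model_state, batch):
    ...  # same single-device training code as in the Quick Start below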

🌟 Try Alpa-served OPT-175B!

Alpa provides a free, unlimited OPT-175B text generation service. Try our service at https://opt.alpa.ai/ and share your prompting results!

Join the Alpa Slack and let us know which new features you want!

Quick Start

Use Alpa's decorator @parallelize to scale your single-device training code to distributed clusters.

import alpa
import jax.numpy as jnp
from jax import grad

# Parallelize the training step in Jax by simply using a decorator
@alpa.parallelize
def train_step(model_state, batch):
    def loss_func(params):
        out = model_state.forward(params, batch["x"])
        return jnp.mean((out - batch["y"]) ** 2)

    grads = grad(loss_func)(model_state.params)
    new_model_state = model_state.apply_gradient(grads)
    return new_model_state

# The training loop now automatically runs on your designated cluster
model_state = create_train_state()
for batch in data_loader:
    model_state = train_step(model_state, batch)
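
On a multi-node setup, Alpa is typically attached to a Ray cluster before the parallelized step is called. A minimal sketch, assuming the alpa.init entry point and a Ray cluster that has already been started:

import alpa

# Assumption: alpa.init attaches Alpa to an existing Ray cluster
# (e.g. one started with `ray start --head`); the exact setup call
# may differ depending on your Alpa version.
alpa.init(cluster="ray")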

Check out the Alpa Documentation site for installation instructions, tutorials, examples, and more.

Installation

The quickest way to get started with Alpa is via pip. We push the latest Alpa wheels to the PyPI index and the Alpa-modified jaxlib to our GitHub-hosted index.

# Install alpa
pip install alpa

# Install the Alpa-modified jaxlib compatible with CUDA >= 11.1 and cuDNN >= 8.0.5
pip install jaxlib==0.3.5+cuda111.cudnn805 -f https://alpa-projects.github.io/wheels.html

# You can install for other CUDA versions via:
pip install jaxlib==0.3.5+cuda{cuda_version}.cudnn{cudnn_version} -f https://alpa-projects.github.io/wheels.html

All supported CUDA and cuDNN versions are listed on the index page.

After installation, verify that everything works as expected:

# Start the Ray cluster first
ray start --head
python -m alpa.test_install
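
Beyond the bundled test, a quick way to confirm that the GPU-enabled jaxlib was picked up is to list the devices JAX can see. A small sketch using the standard JAX API:

import jax

# With the Alpa-modified, CUDA-enabled jaxlib installed, this should list
# the machine's GPUs; seeing only CPU devices usually means the plain
# CPU wheel was installed instead.
print(jax.devices())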

You can also install Alpa from source for development or for other CUDA versions. Follow this detailed guide to install Alpa from source, and consult it for troubleshooting if you run into any errors during the process.

Learning more

Getting Involved

License

Alpa is licensed under the Apache-2.0 license.
