young-eun-nam / tensorsparkml

This project was forked from kaist-dmlab/tensorsparkml.


TensorSparkML

TensorSparkML brings scalable in-memory machine learning to TensorFlow on Apache Spark clusters.


By combining salient features from the TensorFlow deep learning framework with Apache Spark and Apache Hadoop, TensorFlowOnSpark enables distributed deep learning on a cluster of GPU and CPU servers.

It enables both distributed TensorFlow training and inferencing on Spark clusters, with the goal of minimizing the code changes required to run existing TensorFlow programs on a shared grid. Its Spark-compatible API manages the TensorFlow cluster through the following steps (a minimal driver-side sketch follows the list):

  1. Startup - launches the TensorFlow main function on the executors, along with listeners for data/control messages.
  2. Data ingestion
    • InputMode.TENSORFLOW - leverages TensorFlow's built-in APIs to read data files directly from HDFS.
    • InputMode.SPARK - sends Spark RDD data to the TensorFlow nodes via a TFNode.DataFeed class. Note that we leverage the Hadoop Input/Output Format to access TFRecords on HDFS.
  3. Shutdown - shuts down the TensorFlow workers and PS nodes on the executors.
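A minimal driver-side sketch of this lifecycle, assuming the TensorFlowOnSpark Python API and a hypothetical worker function main_fun(args, ctx) defined in a module of your own (the module name, executor counts, and data path below are illustrative, not part of the shipped examples):

from pyspark import SparkConf, SparkContext
from tensorflowonspark import TFCluster
from mnist_dist import main_fun  # hypothetical module holding the TensorFlow worker code

sc = SparkContext(conf=SparkConf().setAppName("tfos_lifecycle_sketch"))

# 1. Startup: launch main_fun(args, ctx) on every executor, plus the data/control listeners.
cluster = TFCluster.run(sc, main_fun, None, 2, 0, False, TFCluster.InputMode.SPARK)

# 2. Data ingestion (InputMode.SPARK): push RDD records to the TensorFlow nodes,
#    where they are consumed through a TFNode.DataFeed on each executor.
images_labels = sc.textFile("data/mnist/csv/train")  # illustrative path
cluster.train(images_labels, 1)  # 1 epoch

# 3. Shutdown: stop the TensorFlow workers (and PS nodes, if any) on the executors.
cluster.shutdown()

With InputMode.TENSORFLOW, the cluster.train call is not needed; the worker function instead reads its data (for example, TFRecords on HDFS) directly through TensorFlow's own input APIs.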

Background

TensorSparkML provides some important benefits over alternative deep learning solutions.

  • Easily migrate existing TensorFlow programs with fewer than 10 lines of code changes (a worker-side sketch follows this list).
  • Support all TensorFlow functionalities: synchronous/asynchronous training, model/data parallelism, inferencing and TensorBoard.
  • Server-to-server direct communication achieves faster learning when available.
  • Allow datasets on HDFS and other sources to be either pushed by Spark or pulled by TensorFlow.
  • Easily integrate with your existing Spark data processing pipelines.
  • Easily deploy on cloud or on-premises, on either CPUs or GPUs.
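To give a sense of how small that migration is for InputMode.SPARK, here is a hedged worker-side sketch: the model code is ordinary Keras, and only the TFNode.DataFeed lines are TensorFlowOnSpark-specific. The function name, batch size, and record layout are illustrative assumptions, not the exact example code.

import numpy as np
import tensorflow as tf
from tensorflowonspark import TFNode

def main_fun(args, ctx):
    # ctx describes this executor's role in the TF cluster (job name, task index, cluster spec).
    # The model below is unchanged from a single-node TensorFlow script.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    # TensorFlowOnSpark-specific part: consume the batches that Spark pushes to this executor.
    tf_feed = TFNode.DataFeed(ctx.mgr)
    while not tf_feed.should_stop():
        batch = tf_feed.next_batch(64)  # list of records taken from the RDD
        if not batch:
            break
        # Assumed record layout: (flattened 784-pixel image, integer label).
        images = np.array([rec[0] for rec in batch], dtype=np.float32)
        labels = np.array([rec[1] for rec in batch], dtype=np.int64)
        model.train_on_batch(images, labels)
    tf_feed.terminate()

The driver passes this function to TFCluster.run, as in the lifecycle sketch above.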

For distributed clusters, please see our wiki site for detailed documentation on specific environments, such as our getting-started guides for single-node Spark Standalone, YARN clusters, and AWS EC2. Note: the Windows operating system is not currently supported due to this issue.

Usage

To use TensorSparkML with an existing TensorFlow application, you can follow our Conversion Guide, which describes the required changes. Additionally, our wiki site has pointers to presentations that provide an overview of the platform.

Note: since TensorFlow 2.x breaks API compatibility with TensorFlow 1.x, the examples have been updated accordingly. If you are using TensorFlow 1.x, you will need to check out the v1.4.4 tag for compatible examples and instructions.

How to train the MNIST example

Prerequisites

pip install tensorflow
pip install tensorflow-datasets
pip install tensorflowonspark
pip install pyspark
wget -q https://www-us.apache.org/dist/spark/spark-3.0.1/spark-3.0.1-bin-hadoop2.7.tgz
tar xf spark-3.0.1-bin-hadoop2.7.tgz

Training

Data setup (InputMode.SPARK)

You can set SPARK_WORKER_INSTANCES, CORES_PER_WORKER, TFoS_HOME, and SPARK_HOME to match your environment.

You can change the Spark UI port number using the spark.ui.port configuration option.

export SPARK_WORKER_INSTANCES=1
export CORES_PER_WORKER=1
export TOTAL_CORES=$((${CORES_PER_WORKER}*${SPARK_WORKER_INSTANCES}))
export TFoS_HOME=/path/TensorFlowOnSpark
export SPARK_HOME=/path/spark-3.0.1-bin-hadoop2.7
# MASTER is used by start-slave.sh and spark-submit below
export MASTER=spark://$(hostname):7077


# start a local Spark Standalone master and one worker
${SPARK_HOME}/sbin/start-master.sh
${SPARK_HOME}/sbin/start-slave.sh -c ${CORES_PER_WORKER} -m 3G ${MASTER}

# generate the MNIST dataset (CSV) used for InputMode.SPARK training
${SPARK_HOME}/bin/spark-submit \
--jars ${TFoS_HOME}/lib/tensorflow-hadoop-1.0-SNAPSHOT.jar \
--conf spark.ui.port=4050 \
${TFoS_HOME}/examples/mnist/mnist_data_setup.py \
--output ${TFoS_HOME}/data/mnist

# check the generated data
ls -lR ${TFoS_HOME}/data/mnist/csv

# remove any old artifacts
rm -rf ${TFoS_HOME}/mnist_model
rm -rf ${TFoS_HOME}/mnist_export

Train

Caution: once training completes and the cluster shuts down, the Spark UI website is no longer reachable.

# train
${SPARK_HOME}/bin/spark-submit \
--master ${MASTER} \
--conf spark.cores.max=${TOTAL_CORES} \
--conf spark.task.cpus=${CORES_PER_WORKER} \
${TFoS_HOME}/examples/mnist/keras/mnist_spark.py \
--cluster_size ${SPARK_WORKER_INSTANCES} \
--images_labels ${TFoS_HOME}/data/mnist/csv/train \
--model_dir ${TFoS_HOME}/mnist_model \
--export_dir ${TFoS_HOME}/mnist_export

# confirm model
ls -lR ${TFoS_HOME}/mnist_model
ls -lR ${TFoS_HOME}/mnist_export
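Beyond the directory listing, you can confirm the export by loading it back in plain TensorFlow. This is a sketch that assumes ${TFoS_HOME}/mnist_export holds a TF2 SavedModel; depending on the example version, the model may be written into a versioned subdirectory (for example mnist_export/1), so adjust the path if needed.

import tensorflow as tf

# Replace with your actual export location under ${TFoS_HOME}.
export_dir = "/path/TensorFlowOnSpark/mnist_export"

model = tf.saved_model.load(export_dir)
print("signatures:", list(model.signatures.keys()))  # typically ['serving_default']

infer = model.signatures["serving_default"]
print("inputs:", infer.structured_input_signature)
print("outputs:", infer.structured_outputs)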

License

The use and distribution terms for this software are covered by the Apache 2.0 license. See LICENSE file for terms.
