TensorFlowOnSpark brings scalable deep learning to Apache Hadoop and Apache Spark clusters.
It enables both distributed TensorFlow training and inferencing on Spark clusters, with a goal to minimize the amount of code changes required to run existing TensorFlow programs on a shared grid. Its Spark-compatible API helps manage the TensorFlow cluster with the following steps:
- Startup - launches the TensorFlow main function on the executors, along with listeners for data/control messages.
- Data ingestion - feeds data to the TensorFlow workers, either read directly from HDFS by TensorFlow or pushed from Spark RDDs.
- Shutdown - shuts down the TensorFlow workers and PS nodes on the executors.
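From the Spark driver, the three steps above map onto the `TFCluster` API. The following is a minimal sketch, assuming the documented `TFCluster.run`/`train`/`shutdown` calls; `sc`, `map_fun`, `args`, and `data_rdd` are placeholders the caller supplies, not names from this README:

```python
# Driver-side sketch of the startup / data ingestion / shutdown lifecycle.
# Requires a live Spark cluster; the import is deferred for that reason.
def run_lifecycle(sc, map_fun, args, num_executors, num_ps, data_rdd, epochs=1):
    from tensorflowonspark import TFCluster

    # Startup: launch map_fun (the TensorFlow main function) on the executors.
    cluster = TFCluster.run(sc, map_fun, args, num_executors, num_ps,
                            tensorboard=False,
                            input_mode=TFCluster.InputMode.SPARK)
    # Data ingestion: push Spark RDD partitions to the TensorFlow workers.
    cluster.train(data_rdd, epochs)
    # Shutdown: stop the TensorFlow workers and PS nodes.
    cluster.shutdown()
```

With `input_mode=TFCluster.InputMode.TENSORFLOW` instead, the workers would read data files directly (e.g. from HDFS) and no RDD would be fed.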
TensorFlowOnSpark was developed by Yahoo for large-scale distributed deep learning on our Hadoop clusters in Yahoo's private cloud.
TensorFlowOnSpark provides some important benefits (see our blog) over alternative deep learning solutions.
- Easily migrate existing TensorFlow programs with fewer than 10 lines of code changed;
- Support all TensorFlow functionalities: synchronous/asynchronous training, model/data parallelism, inferencing and TensorBoard;
- Server-to-server direct communication achieves faster learning when available;
- Allow datasets on HDFS and other sources to be pushed by Spark or pulled by TensorFlow;
- Easily integrate with your existing data processing pipelines and machine learning algorithms (e.g., MLlib, CaffeOnSpark);
- Easily deploy on cloud or on-premises, on CPUs or GPUs, over Ethernet or InfiniBand.
TensorFlowOnSpark is provided as a pip package, which can be installed on single machines via:
pip install tensorflowonspark
For distributed clusters, please see our wiki site for detailed documentation for specific environments, such as our getting started guides for single-node Spark Standalone, YARN clusters and AWS EC2. Note: the Windows operating system is not currently supported due to this issue.
To use TensorFlowOnSpark with an existing TensorFlow application, you can follow our Conversion Guide, which describes the required changes. Additionally, our wiki site has pointers to some presentations which provide an overview of the platform.
API Documentation is automatically generated from the code.
Contributions are always welcome. For more information, please see our guide for getting involved.
The use and distribution terms for this software are covered by the Apache 2.0 license. See LICENSE file for terms.