
Last commit: Mar. 8, 2019
Created: Aug. 15, 2017

Tensor Bridge

Tensor Bridge is an OpenAPI (Swagger) Specification as well as a simple Connexion wrapper for TensorFlow Serving.

The specification was obtained by compiling an annotated tensor_bridge.proto using grpc-gateway. The result is located in swagger/tensor_bridge.json.

Deprecation notice

TensorFlow Serving supports a REST API natively as of version 1.8.0. The Tensor Bridge API is still useful as an exact REST equivalent of the TF Serving gRPC API; however, this project is deprecated, and we recommend the official REST API for new development.
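For context, the official REST API mentioned above accepts a simpler JSON shape than the gRPC-mirroring one. A minimal sketch of building such a request, assuming the bundled `mnist` model and TF Serving's default REST port 8501 (no server is contacted here; the model name and input size are taken from this repo's example):

```python
import json

# Official TF Serving REST API (v1.8.0+): POST to /v1/models/<name>:predict.
MODEL = "mnist"
url = "http://localhost:8501/v1/models/%s:predict" % MODEL

# The official API wraps inputs in an "instances" list (row format).
payload = {"instances": [[0.0] * 784]}  # one flattened 28x28 image
body = json.dumps(payload)

# With a server running, you would POST it, e.g.:
#   import requests
#   resp = requests.post(url, data=body)
#   digits = resp.json()["predictions"]
print(url)
```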

How is this useful?

The publicly available version of TensorFlow Serving communicates over gRPC.

Now you can use the API to build your own REST service and use JSON to talk to your TensorFlow models. A full example is included in this repo (see app.py and api/). If you prefer Go, you can even generate a reverse proxy automatically using grpc-gateway.
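As a sketch of what "JSON that mirrors the gRPC API" looks like, the body below follows the protobuf JSON mapping of TF Serving's `PredictRequest`. The tensor name `images` and signature `predict_images` are assumptions based on the bundled MNIST example; the exact request path is defined in `swagger/tensor_bridge.json`:

```python
import json

# Hypothetical PredictRequest body as grpc-gateway would map it to JSON.
# Field names follow predict.proto / tensor.proto; tensor and signature
# names depend on how your model was exported.
request = {
    "model_spec": {"name": "mnist", "signature_name": "predict_images"},
    "inputs": {
        "images": {
            "dtype": "DT_FLOAT",
            "tensor_shape": {"dim": [{"size": "1"}, {"size": "784"}]},
            "float_val": [0.0] * 784,  # one flattened 28x28 image
        }
    },
}
body = json.dumps(request)
# POST `body` to the bridge on port 9001 at the path given in
# swagger/tensor_bridge.json.
```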

Installation

Simply run

docker build -t tf-bridge .

from the project root.

This will take a while, as it compiles TensorFlow and TensorFlow Serving from source. (Consider dedicating around 6-8 GB of RAM to Docker.)

Once the image is built, you can start the servers:

docker run -d -p 9001:9001 -p 9000:9000 -e MODEL=mnist tf-bridge

Tensor Bridge can be queried on port 9001.

The gRPC endpoint remains available on port 9000 for convenience and testing.

The MODEL environment variable selects the model to serve. As an example, an exported MNIST model is included in this repo.

To see the Swagger UI, go to http://localhost:9001/ui/.

Client

There is also a simple client located in client/mnist_client.py for testing purposes. Make sure to install the necessary dependencies from requirements.txt.
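The error-rate figure the client reports can be sketched as follows. The `error_rate` helper is illustrative, not the actual code in `client/mnist_client.py`; a real client would first collect predictions by querying the server:

```python
def error_rate(predictions, labels):
    """Percentage of predictions that disagree with the true labels."""
    wrong = sum(1 for p, y in zip(predictions, labels) if p != y)
    return 100.0 * wrong / len(labels)

# Illustrative numbers: 2 wrong out of 5 -> 40.0%
print("Inference error rate: %.1f%%" % error_rate([7, 2, 1, 0, 4], [7, 2, 8, 0, 9]))
```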

If everything went well, you should shortly see the following output:

Inference error rate: 10.4%