Serving TensorFlow Models Using Docker to Classify MNIST Digits
Part 1: A project to deploy TensorFlow digit-classifier models using Docker and integrate them with an Angular front end.
Table of contents
- The Models : git repo → tf-mnist-project
- Serving Models : git repo → tf-serving-mnist-project
- The User Interface : git repo → angular-mnist-project
Recently I implemented an end-to-end solution to better understand how a TensorFlow model could be leveraged in a production environment. There are many great resources available online that walk you through setting up and training TensorFlow models and deploying them as services. However, during this short project, I ran into several issues (mostly due to my lack of experience and understanding) and wanted to document my general process and issue resolution here so others can potentially learn from my mistakes and lessons.
Although I undertook this project to learn about TensorFlow model serving, I wanted to tackle an end-to-end challenge to ensure my understanding (and that my setup works).
This project is broken down into 3 sections/posts:
- Build, train, and save a set of TensorFlow models
- Set up a Docker container to host the TensorFlow models
- Deploy a simple Angular user interface to consume the service exposing the TensorFlow models
Note that the second section is the one that deals with the actual hosting of TensorFlow models, while the other two sections help set up a testing platform. As a problem for my TensorFlow models to solve, I chose the MNIST digit dataset. This dataset is readily available online and typically yields very good results with even the most basic models. This project will involve a set of models to classify these digits, host the models, and then provide an interface for a user to draw digits for classification.
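To make the eventual request/response flow concrete, here is a sketch of the kind of payload the user interface will ultimately send to TensorFlow Serving's REST API (the documented format is a POST to `/v1/models/<name>:predict` with an `"instances"` list). The model name `mnist`, the host, and the 28x28 input shape are illustrative assumptions, not taken from the project's source code.

```python
import json

import numpy as np

# A blank 28x28 canvas standing in for a digit drawn by the user.
# (Hypothetical example; the real input comes from the Angular canvas.)
pixels = np.zeros((28, 28), dtype=np.float32)

# TensorFlow Serving's REST predict API expects a JSON body of the form
# {"instances": [<input 1>, <input 2>, ...]}.
body = json.dumps({"instances": [pixels.tolist()]})

# In the deployed setup this body would be POSTed to something like:
#   http://<vm-host>:8501/v1/models/mnist:predict
# and the class scores read from json.loads(response)["predictions"][0].
print(len(json.loads(body)["instances"][0]))  # → 28 (rows of 28 pixels each)
```

Keeping the wire format this simple is one of the main attractions of TensorFlow Serving's REST endpoint: the front end only needs to serialize the drawn pixels as nested JSON lists.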
A single virtual machine is leveraged for this project, with Docker containers used to deploy the isolated components as illustrated below. For my implementation, an Ubuntu Google Cloud Platform VM instance was used, but any VM will do.
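As a rough sketch of that layout, the two containers could be started on the VM along the following lines. The host paths, image names, and ports here are assumptions for illustration (the official `tensorflow/serving` image does expose its REST API on 8501 and accepts a `--model_config_file` flag); the actual commands are covered in the later sections.

```shell
# Serve the exported SavedModels (assumed layout: /home/user/models/<name>/<version>/).
docker run -d -p 8501:8501 \
  --mount type=bind,source=/home/user/models,target=/models \
  tensorflow/serving \
  --model_config_file=/models/models.config

# Serve the Angular app from its own container (hypothetical image name).
docker run -d -p 80:80 angular-mnist-image
```

Running both containers on one VM keeps the setup simple while still keeping the model service and the UI isolated from each other.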
What will be covered in my post regarding this project
- A high-level overview of leveraged technology
- TensorFlow model structure, training, and hyperparameters I found to be successful
- The detailed process of updating TensorFlow code to generate servable models
- Configuration of the TensorFlow Serving Docker image to deploy multiple models
- A high-level overview of my Angular app deployment and integration with the TensorFlow service
- A high-level overview of deploying both the Angular and TensorFlow serving containers on a Google Cloud Platform instance
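On the multi-model point above: TensorFlow Serving supports hosting several models from one container via a model config file in protobuf text format. A minimal sketch might look like the following, where the model names and base paths are hypothetical placeholders, not the project's actual configuration.

```
model_config_list {
  config {
    name: 'mnist_dense'
    base_path: '/models/mnist_dense'
    model_platform: 'tensorflow'
  }
  config {
    name: 'mnist_deep'
    base_path: '/models/mnist_deep'
    model_platform: 'tensorflow'
  }
}
```

Each `config` entry maps a model name (used in the request URL) to a directory of versioned SavedModel exports.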
What will not be covered in my post regarding this project
- TensorFlow and Angular code walkthrough: I will provide links to resources that do this much better than I can, and I will also provide my source code
- Detailed walkthrough of setting up the hosted instance: relevant links will be provided
Keep in mind that the objective of this project is not to optimize accuracy. The model is finicky with off-center and very large or small digits, as it does not take advantage of convolution. The objective is to deploy a complete, functional end-to-end solution.
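The sensitivity to off-center digits can be illustrated with a toy example: a dense (fully connected) model flattens the image, so each pixel position gets its own weight, and a shifted digit becomes an almost entirely different input vector. The 28x28 blob below is a hypothetical stand-in for a digit, not data from the project.

```python
import numpy as np

# A centered 8x8 "blob" standing in for a drawn digit on a 28x28 canvas.
img = np.zeros((28, 28))
img[10:18, 10:18] = 1.0

# The same blob shifted 6 pixels to the right.
shifted = np.roll(img, shift=6, axis=1)

# Flatten both, as a dense model's input layer would.
v1, v2 = img.ravel(), shifted.ravel()

# Cosine similarity of the flattened vectors: only 2 of the blob's 8
# columns still overlap, so the vectors are largely dissimilar.
cos = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
print(round(cos, 2))  # → 0.25
```

A convolutional model, by contrast, shares its weights across spatial positions, which is what makes it far more tolerant of translated inputs.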
Here is a summary of the components involved in this project: