Serving TensorFlow Models Using Docker to Classify MNIST Digits

Part 1: A project to deploy TensorFlow digit-classifier models using Docker and integrate them with an Angular front end.

Andrew Didinchuk
Jan 19, 2021 · 3 min read

Table of contents

  1. Introduction
  2. The Models : git repo → tf-mnist-project
  3. Serving Models : git repo → tf-serving-mnist-project
  4. The User Interface : git repo → angular-mnist-project

Recently I implemented an end-to-end solution to better understand how a TensorFlow model can be leveraged in a production environment. There are many great resources available online that walk you through setting up and training TensorFlow models and deploying them as services. However, during this short project I ran into several issues (mostly due to my lack of experience and understanding) and wanted to document my general process and issue resolutions here so others can potentially learn from my mistakes and lessons.

Although I undertook this project to learn about TensorFlow model serving, I wanted to tackle an end-to-end challenge to validate both my understanding and my setup.

Project scope

This project is broken down into three sections/posts:

  1. Build, train, and save a set of TensorFlow models
  2. Set up a Docker container to host the TensorFlow models
  3. Deploy a simple Angular user interface to consume the service exposing the TensorFlow models

Note that the second section deals with the actual hosting of TensorFlow models, while the other two set up a testing platform around it. As a problem for my TensorFlow models to solve, I chose the MNIST digit dataset. This dataset is readily available online and typically yields very good results with even the most basic models. The project involves training a set of models to classify these digits, hosting the models, and then providing an interface for a user to draw digits for classification.

Some sample images from the MNIST dataset

High-level architecture

A single virtual machine is used for this project, with Docker containers providing isolated deployment of each component, as illustrated below. For my implementation I used an Ubuntu Google Cloud Platform VM instance, but any VM can be used.
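As a rough sketch of that layout, the two containers could be started on the VM along these lines. Note that the container names, image names, paths, and ports here are illustrative assumptions, not values taken from the project repos:

```shell
# Run the TensorFlow Serving container, mounting a local SavedModel
# directory (hypothetical path) into the container and exposing the
# REST API port that TensorFlow Serving listens on by default (8501).
docker run -d --name tf-serving \
  -p 8501:8501 \
  -v /opt/models/mnist:/models/mnist \
  -e MODEL_NAME=mnist \
  tensorflow/serving

# Run the Angular front end (assumed to be packaged as its own image,
# e.g. built on nginx) on port 80 of the same VM.
docker run -d --name angular-mnist -p 80:80 angular-mnist-image
```

Because both containers run on one VM, the front end can reach the serving container over the host's network without any orchestration layer.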

High-level system architecture

What will be covered in my post regarding this project

  1. A high-level overview of leveraged technology
  2. TensorFlow model structure, training, and hyperparameters I found to be successful
  3. The detailed process of updating TensorFlow code to generate servable models
  4. Configuration of the TensorFlow Serving Docker image to deploy multiple models
  5. A high-level overview of my Angular app deployment and integration with the TensorFlow service
  6. A high-level overview of deploying both the Angular and TensorFlow serving containers on a Google Cloud Platform instance
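Regarding point 4 above, TensorFlow Serving supports hosting several models at once through a model config file passed to the server. A minimal sketch might look like the following, where the model names and base paths are placeholders rather than the ones used in the repos:

```
model_config_list {
  config {
    name: "mnist_dense"
    base_path: "/models/mnist_dense"
    model_platform: "tensorflow"
  }
  config {
    name: "mnist_deep"
    base_path: "/models/mnist_deep"
    model_platform: "tensorflow"
  }
}
```

Each `config` entry maps a model name (used in the request URL) to a directory of versioned SavedModels; the serving container is then started with `--model_config_file` pointing at this file instead of the single `MODEL_NAME` environment variable.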

What will not be covered in my post regarding this project

  1. TensorFlow and Angular code walkthroughs — I will provide links to resources that do this much better than I can, and I will also provide my source code
  2. A detailed walkthrough of setting up the hosted instance — relevant links will be provided

Keep in mind that the objective of this project is not to optimize accuracy. The model is finicky with off-center and over- or undersized images, as it does not take advantage of convolution. The objective is to deploy a complete, functional end-to-end solution.

Here is a summary of the components involved in this project:

  1. Introduction
  2. The Models : git repo → tf-mnist-project
  3. Serving Models : git repo → tf-serving-mnist-project
  4. The User Interface : git repo → angular-mnist-project

Written by Andrew Didinchuk

Serial tinkerer and digital architect
