A look at the free tiers on these platforms and how they compare.

Diagram by author

A good chunk of my free time is spent tinkering, designing, and implementing systems that some would likely describe as “overengineered”. Do I really need a distributed Pub/Sub platform and a Data Lake for a home automation platform with 3 devices producing less than a kilobyte of data daily that gets consumed by a single web application? Probably not… but how else am I going to find the opportunity to figure out how these types of cloud services integrate and complement each other?

This post is not about convincing you to adopt my approach to your personal hobbies but instead…


A guide to unlocking data streaming for Python applications with the Google Cloud Platform

Photo by David Clode on Unsplash

Python is a popular language for all sorts of data processing today. Use cases range from web applications and machine learning applications all the way to hardware control on devices like the Raspberry Pi. When it comes to these event systems and real-time data processing, leveraging Pub/Sub platforms can add modularity and scalability to your solutions — you can read more about this here.

Read about why I used Google Cloud Platform tools for my hobby projects here.

Objectives

In this article, I will walk through setting up a Python application to publish and consume data from Google’s Pub/Sub.

Time required: 15…
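Since the full walkthrough is truncated in this preview, the core publish/consume flow might look something like the sketch below. It assumes the google-cloud-pubsub package is installed, that the project, topic, and subscription names (all hypothetical here) already exist, and that application default credentials are configured.

```python
import json


def encode_reading(device_id, value):
    """Serialize a sensor reading into the bytes payload Pub/Sub messages carry."""
    return json.dumps({"device": device_id, "value": value}).encode("utf-8")


def main():
    # Imported here so the helper above stays usable without the client library.
    from google.cloud import pubsub_v1

    project_id = "my-project"   # hypothetical project ID
    topic_id = "sensor-data"    # hypothetical topic

    publisher = pubsub_v1.PublisherClient()
    topic_path = publisher.topic_path(project_id, topic_id)

    # publish() returns a future; result() blocks until Pub/Sub acks the message.
    future = publisher.publish(topic_path, data=encode_reading("dev-1", 21.5))
    print("published message", future.result())

    subscriber = pubsub_v1.SubscriberClient()
    sub_path = subscriber.subscription_path(project_id, "sensor-data-sub")

    def callback(message):
        print("received:", message.data.decode("utf-8"))
        message.ack()

    # subscribe() returns a StreamingPullFuture; cancel() stops the background pull.
    streaming_pull = subscriber.subscribe(sub_path, callback=callback)
    try:
        streaming_pull.result(timeout=30)
    except Exception:
        streaming_pull.cancel()

# Call main() to run this against a real GCP project.
```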


A walkthrough of deploying a simple node.js application on Google Cloud Platform’s App Engine.

Photo by Emile Perron on Unsplash

Google Cloud’s App Engine allows you to deploy scalable web applications on a platform fully managed by Google. These applications can range from back-end services and API layers to front-end applications running on Angular and React frameworks. Google’s free tier includes 28 instance hours per day, so you can get away with some free hosting!

In this article, I will walk you through the deployment of a simple Node.js application on this App Engine platform.

Prerequisites

To follow along, you should have the following:

  1. A basic understanding of how Node.js works
  2. Node installed on your local machine


A how-to guide for deploying your very own Ghost blog using Google Cloud Platform’s Compute Engine for free.

Photo by Intricate Explorer on Unsplash

Ghost is a popular open-source blogging platform of which I am a huge advocate. Here is what a blog hosted on Ghost looks like. The platform is secure, lightweight, and very easy to use and customize. The Ghost team has a post on how their platform compares to WordPress here.

There is a managed hosting fee if you want Ghost to host the blog for you, but in this article, I will walk you through setting up a Dockerized Ghost blog on the Google Cloud Platform (GCP). With this approach, your blog can be hosted absolutely free and the setup…


Understanding auto-generated GCS buckets, charge origins, and how to remove them.

Photo by Pedro da Silva on Unsplash

I enjoy using the Google Cloud Platform (GCP) for hobby projects (check out why I use GCP here) and Google’s Cloud Storage (GCS) product has made its way into my design several times. However, I quickly realized that other GCP services leverage GCS, creating buckets and filling them with objects.

At first, my use of GCS was light, and these system buckets didn’t bother me, but then I started seeing charges on my bill (albeit just a few cents), and decided it was time to understand what these buckets were for and how I could remove or reduce the charges.
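As an illustration of the cleanup step, the sketch below flags bucket names matching patterns that GCP services commonly generate on your behalf (App Engine and Cloud Build staging buckets, Container Registry artifact buckets, Cloud Functions source buckets). Treat the exact patterns as an assumption and verify them against your own project before deleting anything.

```python
import re

# Name patterns of buckets GCP services commonly auto-create. These are
# assumptions based on common defaults; check your own project first.
SYSTEM_BUCKET_PATTERNS = [
    re.compile(r"^staging\..+\.appspot\.com$"),          # App Engine / Cloud Build staging
    re.compile(r"^(?:[a-z]+\.)?artifacts\..+\.appspot\.com$"),  # Container Registry
    re.compile(r"^gcf-sources-"),                        # Cloud Functions sources
]


def looks_system_generated(bucket_name):
    """Return True if a bucket name matches a known auto-generated pattern."""
    return any(p.search(bucket_name) for p in SYSTEM_BUCKET_PATTERNS)


def main():
    # Requires google-cloud-storage and application default credentials.
    from google.cloud import storage

    client = storage.Client()
    for bucket in client.list_buckets():
        if looks_system_generated(bucket.name):
            print("system bucket:", bucket.name)
            # To remove it and its contents (irreversible!):
            # bucket.delete(force=True)  # force=True deletes contained objects (up to 256)

# Call main() to scan a real project.
```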


A high-level overview of the Pub/Sub pattern and why you should be incorporating it into your projects

The Pub/Sub pattern is nothing new, but with the growing complexity of event systems and advances in distributed computing, it is steadily growing in popularity. In this article, I will explain the Pub/Sub pattern at a high level and give you some reasons to include it in projects of varying sizes.

Without Pub/Sub

When building complex event-driven systems that require two or more components to communicate with each other, the traditional and simplest approach is to wire those components directly together. In some cases, this is done using web service APIs, flat-file exchange, or through shared data stores like databases. …
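To make the contrast concrete, here is a toy, in-process illustration of the coupling problem; a real system would of course use a managed broker like Cloud Pub/Sub rather than this stand-in Topic class.

```python
class Topic:
    """Minimal in-process stand-in for a Pub/Sub topic."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, handler):
        self._subscribers.append(handler)

    def publish(self, event):
        # Fan the event out to every subscriber; the publisher never
        # needs to know who (or how many) they are.
        for handler in self._subscribers:
            handler(event)


def store_reading(event): ...
def alert_on_reading(event): ...


# Without Pub/Sub: the producer is hard-wired to every consumer, and
# adding a consumer means editing the producer.
def read_sensor_tightly_coupled():
    event = {"temp": 21.5}
    store_reading(event)
    alert_on_reading(event)


# With Pub/Sub: consumers attach themselves to the topic, and the
# producer stays unchanged when new consumers appear.
readings = Topic()
readings.subscribe(store_reading)
readings.subscribe(alert_on_reading)


def read_sensor_decoupled():
    readings.publish({"temp": 21.5})
```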


A walkthrough of how serverless batch jobs can be set up in the GCP platform using the Cloud Scheduler, Pub/Sub, and Cloud Functions.

Photo by Lukas Blazek on Unsplash

While designing and implementing solutions, I am often faced with the need to set up recurring batch jobs around data storage and processing. Recently I have been trying to keep my infrastructure as serverless as possible, so in this article, I will show you how the Google Cloud Platform can be leveraged to run almost any batch job your project might need for free.

Use Cases

For me, this batch pattern is the most useful when it comes to data processing, reconciliation, and cleanup. Here is an example involving data aggregation…

A bucket can be an effective repository for streaming data but if…
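As a sketch of the final link in the Scheduler → Pub/Sub → Cloud Functions chain described above, a (1st-gen) Cloud Function subscribed to the Scheduler's topic might look like the following; the function name and payload shape are hypothetical.

```python
import base64
import json


def run_batch_job(event, context):
    """Hypothetical Pub/Sub-triggered Cloud Function entry point.

    For Pub/Sub triggers, the message data arrives base64-encoded
    in event["data"]; Cloud Scheduler fills in whatever payload you
    configure on the job.
    """
    payload = {}
    if "data" in event:
        payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    job = payload.get("job", "default")
    print(f"running batch job: {job}")
    # ... aggregation / reconciliation / cleanup work goes here ...
    return job
```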


Part 4: Angular + Docker front end implementation to utilize the TensorFlow generated MNIST classification models.

In this post, I will be going through the process of setting up an Angular front end to connect to and utilize some of the TensorFlow models that were set up in previous posts. The model setup and training walkthrough can be found here and the Docker serving walkthrough here. This post is part of the TensorFlow + Docker MNIST Classifier series.

If you are not familiar with Angular, I highly recommend at least going through the official getting-started tutorial before implementing any of the code below. Alternatively, you can use your own front end instead of Angular.

This…


Part 3: Setting up Google’s TensorFlow serving application and hosting multiple models.

This post will be covering the process of setting up TensorFlow Serving and exposing the two models that were built and trained in the previous post. TensorFlow Serving is a system for managing machine learning models and exposing them to consumers via a standardized API. This post is part of the TensorFlow + Docker MNIST Classifier series.

If you are not familiar with Docker, I highly recommend going through the official getting started tutorial before implementing any of the code below.

For all of my API testing, I will be using the Postman application. …
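For readers who prefer code over Postman, the helpers below sketch how a TensorFlow Serving REST predict request is shaped (Serving's REST API listens on port 8501 and expects a JSON body with an "instances" list). The model name mnist and the host are placeholders, and the actual HTTP call, commented out, would need the requests package and a running Serving container.

```python
import json


def predict_url(host, model, version=None):
    """Build the TensorFlow Serving REST predict endpoint URL."""
    base = f"http://{host}:8501/v1/models/{model}"
    if version is not None:
        base += f"/versions/{version}"
    return base + ":predict"


def predict_body(instances):
    """TensorFlow Serving's REST API expects a body of {"instances": [...]}."""
    return json.dumps({"instances": instances})


# Example call against a running container (hypothetical model name):
# import requests
# resp = requests.post(predict_url("localhost", "mnist"),
#                      data=predict_body([[0.0] * 784]))
# print(resp.json()["predictions"])
```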


Part 2: TensorFlow neural network implementation and training for classifying MNIST handwritten images.

This post will be covering the two models that were set up in TensorFlow to process MNIST digit data, how training was conducted, and finally how the results were converted into a tangible model to be leveraged downstream. This post is part of the TensorFlow + Docker MNIST Classifier series.

I will not be covering the basics of TensorFlow in these posts. With the massive amount of resources available online, I am typically not a huge fan of programming literature; however, for learning TensorFlow, I highly recommend this e-book for grasping the fundamentals.

The Data Set (MNIST): This is…
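The TensorFlow code itself is elided from this preview, so as a stand-in, here is a minimal NumPy sketch of the underlying idea: a single-layer softmax classifier mapping flattened 28×28 MNIST pixels to 10 digit classes.

```python
import numpy as np

# NumPy stand-in for the simplest MNIST model: a softmax classifier
# taking 28*28 = 784 flattened pixels to probabilities over 10 digits.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(784, 10))  # weights
b = np.zeros(10)                            # biases


def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)   # shift for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)


def predict(images):
    """images: (n, 784) array of pixel values -> (n, 10) class probabilities."""
    return softmax(images @ W + b)


def cross_entropy(probs, labels):
    """Mean negative log-likelihood of the true classes (the training loss)."""
    return -np.log(probs[np.arange(len(labels)), labels]).mean()
```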

Andrew Didinchuk

Serial tinkerer and digital architect
