Machine Learning with TensorFlow on Google Cloud Platform (GCP)

Ilias Papachristos
DataDrivenInvestor


This article is about my journey through two of the Specializations that Google Cloud Platform (GCP) offers on the e-learning platform Coursera.

But first, let me tell you how I found them. At the beginning of 2018, I read an article by Lak Lakshmanan. He wrote about the 10 courses he had built with his team, divided into two Specializations of 5 courses each. At that time I didn’t have time to study, so I filed it under “ToRead”.

At some point, I attended a Google webinar from the Coursera community, and there I “won” a free month for a single Specialization.

So I signed up for Machine Learning with TensorFlow on Google Cloud Platform. In it you learn what Machine Learning is, and through the Qwiklabs you do your practice and finish your assignments.

End-to-End Recommendation System

The 1st course, How Google Does ML, opens with a general overview of the Specialization, continues with the AI strategy Google is following, then moves on to how Google has been doing ML over the years; the penultimate module talks about what to avoid and what to consider when you want to use ML, and at the end you learn how to use their own Python notebooks.

Launching into ML is a historical retrospective of ML and an explanation of why neural networks are now doing so well on such a large variety of Data Science problems.

The goals they set for you by the end of this course are:
• To know why Deep Learning is so popular,
• To optimize and evaluate models using loss functions and performance metrics,
• To deal with the common problems you’ll meet in ML, and
• To create repeatable and scalable training, evaluation, and test datasets (there’s a small sketch of that last idea right after this list).
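
The idea behind repeatable splits is to assign each row to train/valid/test by hashing a stable key, so the split never changes between runs and scales to any dataset size; the labs do something similar in BigQuery. Here is a minimal Python sketch of the same idea (the keys and split fractions below are made up):

```python
import hashlib

def split_bucket(key, train_frac=0.8, valid_frac=0.1):
    """Assign a row to train/valid/test from a hash of a stable key,
    so the split is repeatable across runs and scales to any data size."""
    bucket = int(hashlib.md5(str(key).encode()).hexdigest(), 16) % 100
    if bucket < train_frac * 100:
        return "train"
    if bucket < (train_frac + valid_frac) * 100:
        return "valid"
    return "test"

# Hashing on a date keeps all rows from the same day in the same split.
for day in ["2018-01-01", "2018-01-02", "2018-01-03"]:
    print(day, split_bucket(day))
```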

The logical continuation is Introduction to TensorFlow. In this Specialization, and in the next one as well, the labs you’ll work through use TensorFlow 1.7 or 1.8. The course starts with how to use TensorFlow at a low level (the highest level being tf.estimator) and works through the concepts and APIs needed to write distributed ML models. They also explain how to scale the training of such a model and how to make high-performance predictions using Cloud ML Engine.
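
To give a feel for the Estimator level the course works up to, here is a minimal TF 1.x tf.estimator sketch; the feature names and toy data are my own illustration, not taken from the course labs:

```python
import tensorflow as tf  # TensorFlow 1.x, as used in the course labs

# Hypothetical feature columns for a toy regression problem.
feature_columns = [
    tf.feature_column.numeric_column("sq_footage"),
    tf.feature_column.categorical_column_with_vocabulary_list(
        "city", vocabulary_list=["A", "B", "C"]),
]

def train_input_fn():
    # A tiny in-memory dataset, just to show the input_fn pattern.
    features = {"sq_footage": [1000.0, 2000.0], "city": ["A", "B"]}
    labels = [100.0, 200.0]
    dataset = tf.data.Dataset.from_tensor_slices((features, labels))
    return dataset.repeat().batch(2)

estimator = tf.estimator.LinearRegressor(feature_columns=feature_columns)
estimator.train(input_fn=train_input_fn, steps=100)
```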

Next we have to improve the ML model we’ve built: raise its accuracy and find out which columns carry features that are useful to us. In Feature Engineering we learn how to preprocess and transform them for optimal use in our ML model.
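
To illustrate the kind of transformation this covers, here is a small sketch with TF 1.x feature columns: bucketizing two numeric columns and crossing the buckets so a model can learn location-specific effects (the column names and boundaries are hypothetical):

```python
import tensorflow as tf  # TF 1.x feature columns

# Hypothetical raw inputs; the names and boundaries are illustrative only.
lat = tf.feature_column.numeric_column("pickup_latitude")
lon = tf.feature_column.numeric_column("pickup_longitude")

# Bucketize the raw coordinates, then cross the buckets so the model can
# learn effects per grid cell rather than a single linear trend.
lat_buckets = tf.feature_column.bucketized_column(lat, boundaries=[40.5, 40.7, 40.9])
lon_buckets = tf.feature_column.bucketized_column(lon, boundaries=[-74.1, -73.9, -73.7])
location = tf.feature_column.crossed_column([lat_buckets, lon_buckets], hash_bucket_size=100)

feature_columns = [
    lat_buckets,
    lon_buckets,
    tf.feature_column.indicator_column(location),  # one-hot wrap for DNN models
]
```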

The Art and Science of Machine Learning is the last course of the program. In it they show us the essential skills of ML: the good judgment and experimentation needed to tune and optimize our ML model for the best performance. First, how to configure it manually to see the impact on model performance, and then how to tune the hyperparameters automatically using Cloud ML Engine on Google Cloud Platform.
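
For context on how the automatic tuning plugs into your code: Cloud ML Engine passes each trial’s hyperparameter values to the trainer as ordinary command-line flags, so the training code only has to declare them as arguments. A minimal sketch (the flag names are illustrative):

```python
import argparse
import tensorflow as tf

# Each hyperparameter-tuning trial launches the trainer with its chosen
# values as command-line flags; declaring them with argparse is enough.
parser = argparse.ArgumentParser()
parser.add_argument("--learning_rate", type=float, default=0.01)
parser.add_argument("--hidden_units", type=str, default="128,64")
args, _ = parser.parse_known_args()

optimizer = tf.train.AdamOptimizer(args.learning_rate)
hidden_units = [int(n) for n in args.hidden_units.split(",")]
print("Training trial with lr=%s, hidden_units=%s" % (args.learning_rate, hidden_units))
```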

When I finished this Specialization, I had a much better knowledge of TensorFlow and had worked a lot with Google Cloud.

In November I attended Google’s Let’s Talk AI and, again, got a free month for a single Specialization.

So I started Advanced ML with TensorFlow on GCP. I recommend it to those who have intermediate-to-advanced knowledge of ML; otherwise, it is better to start with the Specialization I wrote about above.

The 1st Specialization was the… warm-up, the appetizer. Here the magic of ML begins!

In this Specialization I really got my hands “dirty”, using Google Cloud Platform a lot. In the labs I gained practical experience developing, optimizing and scaling production ML models of various types. It taught me how to create custom, accurate and production-ready models for structured data, image data, time series, NLP and recommendation systems.

Even before saying “good morning” we are swimming in the deep end! In End-to-End ML with TensorFlow on GCP, I started by looking at the steps for deploying ML in a production environment, continued with exploring the data using Datalab and BigQuery (really incredible!), created a dataset using Pandas in Datalab, built my own model, preprocessed the data at scale to make it operational, and trained it on Cloud ML Engine!
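
The Datalab-plus-BigQuery exploration really is that pleasant. From memory it looks roughly like this, pulling a sample from a public table into a Pandas DataFrame inside a Datalab notebook (the query itself is just an example, not copied from the course notebooks):

```python
# Inside a Cloud Datalab notebook, where google.datalab is preinstalled.
import google.datalab.bigquery as bq

# Pull a small sample from a public BigQuery table into a Pandas DataFrame.
sql = """
SELECT weight_pounds, mother_age, plurality, gestation_weeks
FROM `bigquery-public-data.samples.natality`
WHERE year > 2000
LIMIT 1000
"""
df = bq.Query(sql).execute().result().to_dataframe()
print(df.describe())
```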

ML Model

In the next course, Production Machine Learning Systems, I saw the components and best practices of a high-performance ML system in production environments: the architecture; the high-level design decisions around training and serving models that are needed to hit the right performance profile; how to use data for cloud- and ML-based analysis; how to move data to the cloud for the ML models; why it is best to have the data in the cloud, and how to get the benefits of scale and use fully managed services; how to recognize the ways in which a model depends on its data, make decisions about cost, know when to roll models back to previous versions, and identify the causes of observed model behaviour; and how to determine the performance parameters for ML models.

With Image Understanding with TensorFlow on GCP, I learned the strategy I needed to follow to do image classification with CNNs.

I started with a linear model, then a DNN, and finished with the CNN model. I used dropout and pooling, saw data and image augmentation, transfer learning, batch normalization and, at the end, residual networks. There is also a lab on TPUs. It’s optional because you have to pay $5, but running ResNet on a TPU is worth it! At the very end, I had a look at AutoML.
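
For a flavour of what the final models look like, here is a minimal TF 1.x CNN sketch with pooling, batch normalization and dropout; the layer sizes and input shape are illustrative, not the ones used in the labs:

```python
import tensorflow as tf  # TF 1.x layers API

def cnn_model(images, mode):
    """A small CNN of the kind the course builds up to; sizes are illustrative."""
    training = (mode == tf.estimator.ModeKeys.TRAIN)
    net = tf.layers.conv2d(images, filters=32, kernel_size=3, activation=tf.nn.relu)
    net = tf.layers.max_pooling2d(net, pool_size=2, strides=2)
    net = tf.layers.conv2d(net, filters=64, kernel_size=3, activation=tf.nn.relu)
    net = tf.layers.batch_normalization(net, training=training)
    net = tf.layers.max_pooling2d(net, pool_size=2, strides=2)
    net = tf.layers.flatten(net)
    net = tf.layers.dense(net, 128, activation=tf.nn.relu)
    net = tf.layers.dropout(net, rate=0.25, training=training)
    return tf.layers.dense(net, 10)  # logits for 10 classes

images = tf.placeholder(tf.float32, [None, 28, 28, 1])  # e.g. MNIST-sized input
logits = cnn_model(images, tf.estimator.ModeKeys.TRAIN)
```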

In Sequence Models for Time Series and Natural Language Processing, I learned about sequence models, their applications, and natural language processing.

• Predicting future values of a time series
• Classifying free-form text
• Addressing time-series and text problems with RNNs
• Choosing between RNNs/LSTMs and simpler models
• Training and re-using word embeddings in text problems
That was some of what I learned here (a rough sketch of the RNN idea follows).
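
Here is the promised sketch of the RNN idea for time series, in TF 1.x: an LSTM reads the sequence and a dense layer predicts the next value (shapes and sizes are illustrative):

```python
import tensorflow as tf  # TF 1.x

SEQ_LEN, N_FEATURES = 30, 1
inputs = tf.placeholder(tf.float32, [None, SEQ_LEN, N_FEATURES])
targets = tf.placeholder(tf.float32, [None, 1])

# The LSTM reads the whole sequence; we keep only its final output.
cell = tf.nn.rnn_cell.LSTMCell(num_units=32)
outputs, state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
last_output = outputs[:, -1, :]

predictions = tf.layers.dense(last_output, 1)       # one-step-ahead forecast
loss = tf.losses.mean_squared_error(targets, predictions)
train_op = tf.train.AdamOptimizer(0.01).minimize(loss)
```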

And that’s how I got to the best part. The last course was the icing on the cake. With Recommendation Systems with TensorFlow on GCP, I finally managed to build a recommendation system from start to finish using many of the tools available on Google Cloud Platform: IAM & admin, Storage, BigData, Composer, Cloud Shell…

Dashboard

Positive
The whole journey was great, from the beginning of the 1st Specialization, with its terminology and historical background, all the way to the end of the 2nd one.

Negative
“This is not a negative of the course, but mine!”
There is one more optional project. I didn’t dare to take it on, because it would be better to have a TPU, and I don’t.


Full-Time Family Man, Retired Military Helicopter Pilot, Kendo Instructor, Google Cloud Champion Innovator AI/ML, Lead GDG Cloud Thessaloniki, WTM Ambassador