GCP Model Pipeline

Learn how to build model pipelines using GCP.


BigQuery and Spark

A common workflow for batch model pipelines is reading input data from a data lake, applying a machine learning model, and then writing the results to an application database.
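
The sketch below illustrates this pattern in PySpark. It is a minimal example under assumed names: the input path, model path, output path, and column names are hypothetical placeholders, and the model is assumed to be a previously saved Spark ML pipeline.

```python
from pyspark.sql import SparkSession
from pyspark.ml import PipelineModel

spark = SparkSession.builder.appName("batch_scoring").getOrCreate()

# 1. Read input data from the data lake (hypothetical Parquet path).
input_df = spark.read.parquet("gs://my-bucket/landing/users/")

# 2. Apply a previously trained machine learning model.
model = PipelineModel.load("gs://my-bucket/models/propensity_model")
scored_df = model.transform(input_df).select("user_id", "prediction")

# 3. Write the results to a location an application can serve from.
scored_df.write.mode("overwrite").parquet("gs://my-bucket/serving/scores/")
```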

In GCP, BigQuery serves as the data lake and Cloud Datastore can serve as an application database. We’ll build an end-to-end pipeline with these components in the next chapter, but for now, we’ll get hands-on with a subset of the GCP components directly in Spark.
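
As a preview, the snippet below sketches what reading BigQuery data directly in Spark can look like, assuming the spark-bigquery connector is available on the cluster (it ships preinstalled on Cloud Dataproc). The table referenced is the natality table from BigQuery's public datasets; treat the session setup as a sketch rather than a complete cluster configuration.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bigquery_read").getOrCreate()

# Read a BigQuery table into a Spark DataFrame via the
# spark-bigquery connector's "bigquery" data source.
df = (
    spark.read.format("bigquery")
    .option("table", "bigquery-public-data.samples.natality")
    .load()
)

df.printSchema()
df.show(5)
```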
