Exporting Information

Learn how to export DataFrames from Spark to different destinations.

In the Spark world, exporting information is simply the act of writing a DataFrame (or another abstraction) to a persistent data source, such as a database, or to persistent storage, such as a file. In this lesson, we are going to learn how to do both.

There are many similarities between ingesting and exporting data at the API level. This is no accident: the creators of the Spark API deliberately made the reader and writer interfaces mirror each other, giving developers an easy-to-remember API.
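This symmetry is easy to see side by side. The sketch below assumes an existing `SparkSession` named `spark`; the file paths are hypothetical:

```scala
// Reading: spark.read returns a DataFrameReader
val df = spark.read
  .format("csv")
  .option("header", "true")
  .load("/tmp/input/users.csv") // hypothetical path

// Writing: df.write returns a DataFrameWriter that mirrors the reader API
df.write
  .format("parquet")
  .mode("overwrite")           // behavior when the target already exists
  .save("/tmp/output/users")   // hypothetical path
```

Both builders follow the same pattern: choose a `format`, set `option`s, then `load` or `save`.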

Exporting to a database

We can pick up where we left off in the previous project because we already have a database, an embedded Derby DB, bootstrapped into the application during its startup.
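Writing a DataFrame into that database goes through Spark's JDBC support. Here is a minimal sketch, assuming a DataFrame `df` already exists; the JDBC URL and table name are hypothetical and should match your project's Derby configuration:

```scala
import java.util.Properties

// Connection properties for the embedded Derby driver
val props = new Properties()
props.setProperty("driver", "org.apache.derby.jdbc.EmbeddedDriver")

// Append the DataFrame's rows to the "users" table (hypothetical name)
df.write
  .mode("append") // add rows without dropping existing data
  .jdbc("jdbc:derby:memory:sampledb", "users", props)
```

The `mode` call controls what happens if the table already exists (`append`, `overwrite`, `ignore`, or `errorifexists`, the default).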

Before doing so, let’s inspect some new classes added to the project, shown below:
