Creating an ADF Data Pipeline Using Azure CLI
Learn how to create an ADF data pipeline using Azure CLI with step-by-step instructions.
The Azure CLI makes it easy to create, configure, and manage pipelines within ADF. In this lesson, we'll walk through the step-by-step process of creating a data factory using the Azure CLI and demonstrate how to configure the settings needed to get a data pipeline up and running.
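To make this concrete, the Data Factory commands live in the datafactory extension of the Azure CLI. The following is a minimal sketch of provisioning a factory from the command line; the resource group name my-adf-rg, the factory name my-demo-adf, and the region are placeholder values, not names used by the course:

```bash
# Add the Data Factory extension to the Azure CLI (one-time setup)
az extension add --name datafactory

# Create a resource group to hold the data factory (placeholder name and region)
az group create \
  --name my-adf-rg \
  --location eastus

# Create the data factory itself
az datafactory create \
  --resource-group my-adf-rg \
  --factory-name my-demo-adf \
  --location eastus
```

The same placeholder resource group and factory names are reused in the later pipeline commands.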
Designing the data pipeline
Designing an efficient and scalable data pipeline in Azure Data Factory involves several steps. First, identify the source data and the destination for extraction and loading. Next, determine the necessary data transformation activities, such as cleaning, enrichment, aggregation, and joining. Then create a logical representation of the pipeline using ADF's visual interface or code-based authoring. Finally, validate and test the pipeline to ensure it meets the business requirements. Following these steps produces a reliable and effective data pipeline within the selected resource group.
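As a rough sketch of what the code-based logical representation looks like, a pipeline definition is essentially an activities array that mirrors the design: an extract-and-load step followed by a transformation step. The activity and dataset names below are hypothetical placeholders, and each activity's typeProperties are omitted at this stage:

```bash
# Sketch of a logical pipeline representation: the activities array mirrors the
# planned design. Names are placeholders; each activity's typeProperties still
# need to be filled in before the pipeline can be deployed.
cat > pipelineSkeleton.json <<'EOF'
{
  "activities": [
    {
      "name": "CopyRawData",
      "type": "Copy",
      "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
      "outputs": [ { "referenceName": "StagingDataset", "type": "DatasetReference" } ]
    },
    {
      "name": "TransformStagedData",
      "type": "ExecuteDataFlow",
      "dependsOn": [ { "activity": "CopyRawData", "dependencyConditions": [ "Succeeded" ] } ]
    }
  ]
}
EOF
```

Each design decision, which source, which transformations, which destination, ends up as an entry in this array, which makes the pipeline straightforward to validate and test piece by piece.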
Step 1: Create an activity
Creating an activity is crucial for defining the actions to be performed within a data pipeline in ADF. Activities represent individual data processing tasks, such as data extraction, transformation, and loading. By specifying the type of each activity and its associated properties, we tell ADF exactly what operations to execute on the data. Activities serve as the building blocks for constructing the overall workflow and logic of the data pipeline, allowing complex data integration and transformation processes to be orchestrated.
The JSON file shown below is available in the course files as dataFactorytemplate.json and can be used directly in the pipeline create command below.
Below is an example of creating a copy activity in ADF that moves data from one storage location to another.
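A minimal sketch of such a definition, together with the pipeline create call that registers it, might look like the following. It assumes the source and sink datasets have already been defined under the hypothetical names SourceDataset and SinkDataset, and it reuses the placeholder resource group and factory names from earlier:

```bash
# Sketch: a pipeline with a single copy activity that moves data from a source
# dataset to a sink dataset. All names below are placeholder values.
cat > copyPipeline.json <<'EOF'
{
  "activities": [
    {
      "name": "CopyBlobToBlob",
      "type": "Copy",
      "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
      "outputs": [ { "referenceName": "SinkDataset", "type": "DatasetReference" } ],
      "typeProperties": {
        "source": { "type": "BinarySource" },
        "sink": { "type": "BinarySink" }
      }
    }
  ]
}
EOF

# Register the pipeline definition with the data factory
az datafactory pipeline create \
  --resource-group my-adf-rg \
  --factory-name my-demo-adf \
  --name CopyPipeline \
  --pipeline @copyPipeline.json

# Trigger an on-demand run of the pipeline
az datafactory pipeline create-run \
  --resource-group my-adf-rg \
  --factory-name my-demo-adf \
  --name CopyPipeline
```

The BinarySource/BinarySink pair copies files as-is; if the datasets were delimited text instead, the source and sink types would change accordingly.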