Get Alert Emails for Pipeline Metrics

Learn how to set up custom alerts in ADF and receive emails when the alerts are fired.

Alerting and monitoring are essential practices in any development setting because they provide visibility into how a product is performing. In the case of data pipelines, alerting ensures that development teams know as soon as something fails or is not running as expected, so it can be fixed urgently without impacting end users. Similarly, regular monitoring gives teams ongoing visibility into how their services are running. In this lesson, we'll discuss alerts in ADF and create a custom email alert for pipeline failures.

Alerts and metrics in ADF

Azure Data Factory (ADF) provides a robust alerting mechanism that keeps users informed about the status of their data integration workflows. Alerts are notifications triggered by predefined conditions, such as pipeline run failures or delays. These alerts are configured within ADF itself, in addition to the Azure Monitor-hosted alerts that we covered in the previous lesson.
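To see exactly which factory metrics Azure Monitor exposes for alerting, you can list the metric definitions of a data factory programmatically. The sketch below is a minimal example using the azure-mgmt-monitor Python SDK; the subscription ID, resource group, and factory name are hypothetical placeholders, not values from this lesson.

```python
# A minimal sketch, assuming azure-identity and azure-mgmt-monitor are
# installed and you are authenticated (e.g., via `az login`).
# The subscription ID, resource group, and factory name are hypothetical
# placeholders -- replace them with your own values.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient

subscription_id = "<your-subscription-id>"
resource_group = "adf-demo-rg"        # hypothetical resource group
factory_name = "adf-demo-factory"     # hypothetical data factory

# Resource URI of the data factory whose metrics we want to inspect.
factory_uri = (
    f"/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    f"/providers/Microsoft.DataFactory/factories/{factory_name}"
)

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

# List every metric Azure Monitor tracks for this factory,
# e.g., PipelineFailedRuns and PipelineSucceededRuns.
for definition in client.metric_definitions.list(factory_uri):
    print(definition.name.value)
```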

Levels of severity for configuring alerts

When setting up alerts in ADF, users can assign a severity level to each alert rule. The severity levels, Sev0 through Sev4, represent the importance and urgency of the alert: Sev0 signifies the highest severity, indicating critical issues that require immediate attention, while higher-numbered levels denote progressively less critical issues. By configuring alert rules with appropriate severity levels, users can prioritize and respond to alerts based on their impact on data operations and business processes.
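To make the severity setting concrete, here is a hedged sketch of creating a metric alert rule with the azure-mgmt-monitor SDK. It reuses the client, resource_group, and factory_uri objects from the previous snippet; the rule name, threshold, and action group ID are hypothetical placeholders. The rule fires at Sev1 whenever the factory reports at least one failed pipeline run in a 5-minute window.

```python
# A minimal sketch, assuming client, resource_group, and factory_uri
# from the previous snippet are in scope. Names and values below are
# hypothetical placeholders.
from azure.mgmt.monitor.models import (
    MetricAlertResource,
    MetricAlertSingleResourceMultipleMetricCriteria,
    MetricCriteria,
    MetricAlertAction,
)

# ID of the action group that receives the email; creating one is
# sketched in the next section.
action_group_id = "<action-group-resource-id>"

alert_rule = MetricAlertResource(
    location="global",                  # metric alert rules are global resources
    description="Email the team when any pipeline run fails",
    severity=1,                         # Sev1: high urgency, just below critical
    enabled=True,
    scopes=[factory_uri],               # the data factory defined earlier
    evaluation_frequency="PT1M",        # evaluate every minute
    window_size="PT5M",                 # over a 5-minute lookback window
    criteria=MetricAlertSingleResourceMultipleMetricCriteria(
        all_of=[
            MetricCriteria(
                name="failed-runs",
                metric_name="PipelineFailedRuns",
                operator="GreaterThanOrEqual",
                threshold=1,
                time_aggregation="Total",
            )
        ]
    ),
    actions=[MetricAlertAction(action_group_id=action_group_id)],
)

client.metric_alerts.create_or_update(
    resource_group, "pipeline-failure-alert", alert_rule
)
```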

Implementing email alerts in ADF

In this section, we’ll cover the actions that users can trigger from within ADF when their alerts are fired. For example, if a pipeline fails three times in a row, a developer might want to know as soon as possible so they can apply fixes. We’ll create email actions so that whenever the alert condition is met, an email is sent to the defined user group.
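The email recipients live in an action group, which the alert rule references. Here is a hedged sketch of creating one with the azure-mgmt-monitor SDK; it reuses the client and resource_group objects from the earlier snippets, and the group name, short name, and email address are hypothetical placeholders.

```python
# A minimal sketch, assuming client and resource_group from the earlier
# snippets. The group name, short name, and address are hypothetical.
from azure.mgmt.monitor.models import ActionGroupResource, EmailReceiver

action_group = client.action_groups.create_or_update(
    resource_group_name=resource_group,
    action_group_name="pipeline-alerts-ag",
    action_group=ActionGroupResource(
        location="Global",              # action groups are global resources
        group_short_name="pipealerts",  # appears in the notification email
        enabled=True,
        email_receivers=[
            EmailReceiver(
                name="dev-team",
                email_address="dev-team@example.com",
                use_common_alert_schema=True,
            )
        ],
    ),
)

# This resource ID is what the alert rule's MetricAlertAction points at.
action_group_id = action_group.id
```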

Setting up a pipeline

Let’s start by creating a pipeline that we know will fail, so we can test our alert against it.

  1. To set up the data factory, we’ll reuse the pipeline created in the incremental loads (CDC) lesson. Remove the tumbling window trigger from that pipeline.

  2. Remember that the source dataset in this pipeline points to the Azure Blob container holding two binary files, “cdc-test-txt1.rtf” and “cdc-test-txt2.rtf”. The pipeline copies the source files to the destination.

  3. In the sink settings of this pipeline, delete the connection to the destination folder that was added earlier.

  4. Create a new dataset for the sink, and select Azure Data Lake Storage Gen2 using the same linked service created for the Azure account (a programmatic sketch of this dataset follows the list below).

  5. Save and publish the pipeline.
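As referenced in step 4, here is a hedged sketch of what defining that sink dataset looks like with the azure-mgmt-datafactory SDK. The linked service name, container (file system), folder path, and dataset name are hypothetical placeholders, and the ADLS Gen2 container deliberately does not exist yet.

```python
# A minimal sketch, assuming azure-identity and azure-mgmt-datafactory are
# installed, and subscription_id, resource_group, and factory_name from the
# first snippet are in scope. All names below are hypothetical placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobFSLocation,
    BinaryDataset,
    DatasetResource,
    LinkedServiceReference,
)

adf_client = DataFactoryManagementClient(
    DefaultAzureCredential(), subscription_id
)

sink_dataset = DatasetResource(
    properties=BinaryDataset(
        linked_service_name=LinkedServiceReference(
            type="LinkedServiceReference",
            reference_name="AzureDataLakeStorageGen2",  # existing linked service
        ),
        location=AzureBlobFSLocation(
            file_system="cdc-sink",  # this ADLS Gen2 container does not exist yet
            folder_path="output",
        ),
    )
)

adf_client.datasets.create_or_update(
    resource_group, factory_name, "SinkBinaryDataset", sink_dataset
)
```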

Note: We have not yet created an Azure Data Lake Storage Gen2 container in the storage account, so this pipeline will fail. The idea is to let the pipeline fail and confirm that we get an alert when it does. Let's now create the alert that sends the email.
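To exercise the failure without clicking through the UI, you can trigger a run and poll its status programmatically. A hedged sketch, reusing the adf_client, resource_group, and factory_name objects from the previous snippet; the pipeline name is a hypothetical placeholder for the CDC pipeline published above.

```python
# A minimal sketch, assuming adf_client, resource_group, and factory_name
# from the previous snippet. The pipeline name is a hypothetical placeholder.
import time

run = adf_client.pipelines.create_run(
    resource_group, factory_name, "cdc-copy-pipeline"
)

# Poll until the run reaches a terminal state; we expect "Failed" here,
# which is what should fire the alert and send the email.
while True:
    status = adf_client.pipeline_runs.get(
        resource_group, factory_name, run.run_id
    ).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(15)

print(f"Pipeline run finished with status: {status}")
```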
