How to Send Failure & Success Pipeline Email via Logic Apps

In modern data workflows, it’s crucial to have real-time notifications for the status of data pipelines. Azure Data Factory (ADF) is a powerful tool for orchestrating data processes. Azure Logic Apps can be seamlessly integrated with ADF to send email notifications when pipelines start, succeed, or fail. This guide will walk you through setting up this integration.

Prerequisites

  • Azure Subscription: Ensure you have an Azure subscription.
  • Azure Data Factory: Have an existing pipeline in Azure Data Factory.
  • Azure Data Lake Storage: Have two containers (zone1 and zone2) in your Data Lake.
  • Azure Logic Apps: Set up an Azure Logic App to handle HTTP requests and send email notifications.
  • Email Account: An email account to send notifications (e.g., Outlook, Office 365, Gmail).

Scenario

For this demo, I have created an Azure Data Lake with two containers named zone1 and zone2. Our pipeline copies data from one container to the other within the Data Lake. We want to set up email notifications to inform us whenever the pipeline starts, succeeds, or fails.

Azure Data Lake Storage Gen2

I have created two containers in my Data Lake storage and named them zone1 and zone2. I have uploaded a sample CSV file to zone1, while zone2 is empty. We will copy this CSV dataset to zone2 through Azure Data Factory. Please refer to the following screenshots.

Containers in Data Lake

Azure Logic Apps

Azure Logic Apps works by connecting different services (like email, social media, file storage, etc.) together in a sequence of steps called a workflow. Each workflow starts with a trigger, which is an event that sets the workflow in motion (e.g., a new file uploaded, a timer expired, etc.). Once triggered, the Logic App executes actions based on conditions and instructions defined by you. In this tutorial we trigger the workflow with the HTTP POST method available in Azure Logic Apps. Some examples of where Logic Apps are used:

  • Move uploaded files from an SFTP or FTP server to Azure Storage.
  • Schedule and send email notifications using Office 365 when a specific event happens, for example, a new file is uploaded.
  • Monitor tweets, analyze the sentiment, and create alerts or tasks for items that need review.

Overview

Before proceeding further, here is an overview of what we will be doing in this article.

We have a pipeline in Azure Data Factory with a "Copy Data" activity that copies data from zone1 in the Data Lake to zone2. Whenever this pipeline execution completes, we will send the pipeline's execution details (success or failure) via an HTTP POST request to our Logic App. Whenever the Logic App is triggered by that request, it will send an email about the status of the pipeline based on the details it received. Keep in mind that web services communicate using JSON-formatted text; we will use the same format here.

Step 1: Creating an Azure Logic App

Create an Azure Logic App from the Azure dashboard: go to the search bar, type "Logic App", and click "Create". You will be prompted with the page shown below. Give the Logic App any name you want, click "Review + Create", and your Logic App will be created.

Creating a Logic App

Step 2: Design the Logic App

After creating your Logic App, open it in the Azure portal and go to the Logic App designer. Select "Blank Logic App" to start with an empty template. Please refer to the screenshot below.

Creating a Logic app designer

Step 3: Add a Trigger

To begin, add the trigger "When a HTTP request is received." This trigger allows your Azure Data Factory to send a signal to the Logic App whenever a specific event happens. Define the JSON schema that the Azure Data Factory will send, including details like pipeline name, status, and run ID. This trigger will start the "send an email" action, which we will add in the next step.

Adding a Trigger

Once you have added the HTTP trigger, click on it, select POST as the method, and in the request body JSON schema specify the attributes you will receive from the Data Factory HTTP POST request. We need the pipeline name, the Data Factory name, the run ID, the execution status, and the status logs. We specify these attributes as a JSON object; the schema is given below.

{
    "type": "object",
    "properties": {
        "PipelineName": {
            "type": "string"
        },
        "DataFactoryName": {
            "type": "string"
        },
        "Run ID": {
            "type": "string"
        },
        "status": {
            "type": "string"
        },
        "Logs": {
            "type": "string"
        }
    }
}
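
For illustration, an actual request body matching this schema might look like the following (all values here are hypothetical):

{
    "PipelineName": "CopyZone1ToZone2",
    "DataFactoryName": "demo-datafactory",
    "Run ID": "4f6a2b1c-0d3e-4a5b-9c8d-7e6f5a4b3c2d",
    "status": "Succeeded",
    "Logs": "No errors"
}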

Please refer to the screenshot below for more details.

Note: In the parameters you will see the HTTP URL; it is generated only after you save the whole Logic App.

Post

Step 4: Adding the Send Email Action

After adding the trigger, we need to add the actions to be executed when it fires. In our case the action is sending an email. Click the plus button below the trigger and you will be prompted with an "Add an action" popup; type "send an email" and you will be given many options such as Gmail, Outlook, and Yahoo. Select whichever is convenient for you; I have selected a Google email account.

Adding actions

Once you have added the send email action, you need to sign in to an account for the provider you selected. Give the connection any name you want, select "Shared application" as the authentication type, and click "Sign in".

Note: The email account you sign in with now will be used to send the mail to the recipients. Please refer to the screenshot below.

Adding a Gmail Account

Step 5: Writing the Email Alert Body

Once you have added the email action, we need to add the recipients who should receive the pipeline alerts, and also the body containing all the details: Data Factory name, pipeline name, run ID, status, and logs. We can add this data to the email subject and body dynamically. Refer to the screenshot below for how to add recipients.

Adding recipients

Now let us add the dynamic content to the mail we want to send. If you click the lightning symbol in the left-side pane, you will be shown all the fields received through the HTTP POST method. You can add dynamic content in the body of the mail as well. Please refer to the screenshots.

Adding Dynamic content

Once you click the dynamic content button you can see all the fields, such as Data Factory name, pipeline name, run ID, status, and logs; add them in any format that suits you. Please refer to the screenshot below.

Adding Dynamic content to the body
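
For reference, dynamic content tokens from the HTTP trigger render as triggerBody() expressions in the Logic App's code view. A sketch of what a mail body using them could look like (the exact wording is up to you):

Pipeline @{triggerBody()?['PipelineName']} in Data Factory @{triggerBody()?['DataFactoryName']} finished with status: @{triggerBody()?['status']}
Run ID: @{triggerBody()?['Run ID']}
Logs: @{triggerBody()?['Logs']}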

Step 6: Saving the Logic App

Once you have made all the changes to the email subject and body, save the Logic App and copy the generated HTTP URL for future reference; we will be using it in the Azure Data Factory pipeline. Please refer to the screenshot below.

Note: This is an important step; you must save the Logic App to generate the URL.

Saving the Logic App and copying the HTTP URL
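
The exact URL is unique to your Logic App, but it generally has a shape like the following (every segment here is a placeholder, not a real endpoint):

https://prod-00.<region>.logic.azure.com:443/workflows/<workflow-id>/triggers/manual/paths/invoke?api-version=2016-10-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=<shared-access-signature>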

Step 7: Azure Data Factory Setup

Now you need to create the Data Factory, similar to how we created the Logic App. In the Azure portal, click on "Create a resource" and search for "Data Factory." Click "Create" and fill in the required details such as the Resource Group, Data Factory Name, and Region. Once completed, click "Create."

Navigate to your newly created Data Factory. Open the “Author & Monitor” pane to create a new pipeline.

Add a Copy Data Activity: Drag a "Copy Data" activity onto the pipeline canvas. This activity will copy data from zone1 to zone2 within your Azure Data Lake. Please refer to the screenshot below.

Creating Azure Data Pipeline
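
If you open the pipeline's JSON view, the Copy activity ends up roughly like the sketch below. This is only an outline assuming delimited-text datasets; the dataset names Zone1Csv and Zone2Csv are hypothetical placeholders, and the activity is named "Copy Activity" to match the activity('Copy Activity') expressions used later in this article:

{
    "name": "Copy Activity",
    "type": "Copy",
    "inputs": [
        { "referenceName": "Zone1Csv", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "Zone2Csv", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": { "type": "DelimitedTextSource" },
        "sink": { "type": "DelimitedTextSink" }
    }
}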

Step 8: Configure Source and Sink

Configure Source: In the "Source" tab of the Copy Data activity, set up the source dataset to point to the zone1 container. Define the format and other necessary details.

Configure Sink: In the "Sink" tab, set up the destination dataset to point to the zone2 container. Again, define the necessary format and configuration details. Please refer to the screenshots below.

Source settings

Sink settings
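
As a concrete reference, the source dataset behind the "Source" tab could look roughly like this JSON sketch (the dataset, linked service, and file names are hypothetical); the sink dataset for zone2 is analogous, with "fileSystem" set to "zone2":

{
    "name": "Zone1Csv",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureDataLakeStorage1",
            "type": "LinkedServiceReference"
        },
        "type": "DelimitedText",
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "zone1",
                "fileName": "sample.csv"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}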

Step 9: Adding a Web Activity

To send notifications for pipeline activities, you need to add Web activities to your pipeline that send HTTP POST requests to your Logic App. The Web activity posts the pipeline's information to the HTTP URL hosted by the Logic App we created earlier. Here we add a Web activity at the end of the pipeline to report whether the pipeline failed or succeeded. Please look at the screenshots below.

  • Drag and Drop: Drag a Web activity onto the pipeline canvas and connect it to your Copy Data activity.
  • Configuration: Click on the Web activity and configure it to send a POST request to the Logic App URL; a JSON sketch of the resulting activity follows below.

Adding Web activity
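
Once configured (Steps 9 and 10), the Web activity's JSON in the pipeline code view ends up roughly like the sketch below. The activity name "Notify Logic App" is just a placeholder, the URL placeholder stands for the Logic App URL copied in Step 6, the dependsOn block is what the connector from the Copy activity produces, and the body is abbreviated here (the full body is shown in Step 10):

{
    "name": "Notify Logic App",
    "type": "WebActivity",
    "dependsOn": [
        {
            "activity": "Copy Activity",
            "dependencyConditions": [ "Succeeded" ]
        }
    ],
    "typeProperties": {
        "url": "<logic-app-http-url-from-step-6>",
        "method": "POST",
        "headers": { "Content-Type": "application/json" },
        "body": { "PipelineName": "@{pipeline().Pipeline}" }
    }
}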

Step 10: Adding Web Activities for Success and Failure

Now we need to make some changes to the web activity in order to complete its setup.

  • Connect the Copy activity to the Web activity by dragging the success (or failure) connector onto it.
  • Go to the Web activity's settings, paste the URL you copied in Step 6, and choose POST as the method.
  • In the Body field, provide the data in JSON format that will be posted to the Logic App.

Please refer to the below screen shots

Modifying Web Activities

If you look at the JSON schema we wrote in the Logic App's HTTP request trigger, we have fields like Data Factory name, pipeline name, run ID, status, and logs. We will supply those fields here by clicking the "Add dynamic content" button mentioned in the step above. Please refer to the screenshots below.

Adding dynamic Content in webactivity

Please note that pipeline() is a built-in object in Data Factory, available in every pipeline, which contains information about the current run. Please refer to the code below.

{
    "PipelineName":    "@{pipeline().Pipeline}",
    "DataFactoryName": "@{pipeline().DataFactory}",
    "Run ID":          "@{pipeline().RunId}",
    "status":          "@{activity('Copy Activity').Status}",
    "Logs":            "@{activity('Copy Activity').output.errors[0].Message}"
}
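
One caveat: @{activity('Copy Activity').output.errors[0].Message} only resolves when the Copy activity actually produced an error, so a body like the one above suits the failure path. If you wire separate Web activities for success and failure (as this step's title suggests), the success-path body can use a static log message instead; a possible sketch:

{
    "PipelineName":    "@{pipeline().Pipeline}",
    "DataFactoryName": "@{pipeline().DataFactory}",
    "Run ID":          "@{pipeline().RunId}",
    "status":          "Succeeded",
    "Logs":            "Pipeline completed without errors."
}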

Step 11: Executing the Pipeline

Now our entire setup is done. Execute the pipeline by clicking the Debug button and verify whether we receive the alert mails or not. Please look at the screenshots below.

Executing the pipeline

We ran our pipeline at 10:24, and upon success we should receive an email saying the pipeline succeeded. If you check the email account you mentioned in the recipients list, you should find a mail with the success subject and body.

success mail alert

If you now look at zone2 of the Azure Data Lake storage, you should see the data copied from zone1 by the pipeline execution. Next, we will delete the source dataset in zone1 to make our pipeline fail so that we can test the failure case as well. Please refer to the screenshot below.

To execute Failure Case

Executing Pipeline To check Failure Case

Let us now execute the pipeline again and see how it behaves in the failure case. Please refer to the screenshot below.

Pipeline Failure Status

Now that our pipeline has failed, we should have received an email alert.

Email alert for failure case

As you can see, we have received an email alert for the failure, along with a log message stating that our source file is missing.

Conclusion

In this guide, you set up a system that sends email notifications for your Azure Data Factory pipeline activities using Azure Logic Apps. This means you'll get an email whenever your pipeline starts, finishes successfully, or fails, allowing you to keep a close eye on your data processes. By automating these notifications, you can quickly respond to any issues, ensuring your data tasks run smoothly.

This setup shows how you can use Azure’s tools to make your work easier and more efficient, helping you manage your data better. As you get more comfortable with these tools, you can explore more ways to automate and improve your workflows.

Send Failure & Success Pipeline Email via Logic Apps – FAQs

Can I send notifications to multiple email addresses?

Absolutely! Separate the recipient email addresses with semicolons in the To field of the “Office 365 Outlook” (or your email provider) action.
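
For example, with hypothetical addresses:

To: data-team@example.com; oncall@example.com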

What are the prerequisites for setting up email notifications for Azure Data Factory (ADF) pipelines using Logic Apps?

  • Azure Subscription: You’ll need an active Azure subscription. If you don’t have one, you can create a free account.
  • ADF Pipeline: You should have an existing ADF pipeline that you want to monitor for success or failure events.
  • Logic Apps Designer: Familiarity with the Logic Apps designer interface will be beneficial.

How do I troubleshoot issues with email notifications?

  • Verify the Logic App configuration: double-check the trigger's JSON schema and POST method, make sure the URL pasted into the Web activity matches the one generated after saving the Logic App, and confirm the email connection is still signed in.



