In modern data workflows, it is crucial to receive real-time notifications about the status of data pipelines. Azure Data Factory (ADF) is a powerful tool for orchestrating data processes, and Azure Logic Apps can be integrated with ADF to send email notifications when pipelines succeed or fail. This guide walks you through setting up this integration.

Prerequisites

To follow along you will need an Azure subscription with permission to create a Data Factory, a Logic App, and an Azure Data Lake Storage Gen2 account, plus an email account (Gmail, Outlook, etc.) to send the alerts from.
Scenario

As a demo, I have created an Azure Data Lake with two containers named zone1 and zone2. Our pipeline copies data from one container to the other within the Data Lake, and we want email notifications whenever the pipeline succeeds or fails.

Azure Data Lake Storage Gen2

I have created two containers in my Data Lake storage, named zone1 and zone2, and uploaded a sample CSV file to zone1; zone2 is empty. We will copy this CSV dataset to zone2 through Azure Data Factory. Please refer to the following screenshots.

![]()

Azure Logic Apps

Azure Logic Apps works by connecting different services (such as email, social media, and file storage) in a sequence of steps called a workflow. Each workflow starts with a trigger, an event that sets the workflow in motion (e.g., a new file is uploaded or a timer expires). Once triggered, the Logic App executes actions based on the conditions and instructions you define. In this tutorial we trigger the workflow with the HTTP POST method available in Azure Logic Apps. Typical uses of Logic Apps include sending notifications, synchronizing files between services, and automating approval workflows.
Overview

Before proceeding further, here is an overview of what we will build in this article. We have a pipeline in Azure Data Factory with a "Copy Data" activity that copies data from zone1 to zone2 in the Data Lake. Whenever this pipeline finishes, it sends the execution details (success or failure) to our Logic App via an HTTP POST request. When the Logic App is triggered by that request, it sends an email about the pipeline status based on the details it received. Remember that web services communicate using JSON-formatted text, and we will do the same here.

Step 1: Creating the Azure Logic App

Create an Azure Logic App from the Azure dashboard: go to the search bar, type "Logic App," and click Create. You will be prompted with the page below. Give the Logic App any name you want and click "Review and Create"; your Logic App will be created.

![]()

Step 2: Design the Logic App

After creating your Logic App, open it in the Azure portal and go to the Logic App designer. Select "Blank Logic App" to start with an empty template. Please refer to the screenshot below.

![]()

Step 3: Add a Trigger

To begin, add the trigger "When a HTTP request is received." This trigger allows your Azure Data Factory to send a signal to the Logic App whenever a specific event happens. Define the JSON schema that the Data Factory will send, including details such as pipeline name, status, and run ID. This trigger will start the "send an email" action, which we add in the next step.

![]()

Once you have added the HTTP trigger, click on it, set the method to POST, and in the request body JSON schema specify the attributes you will receive from the Data Factory HTTP POST request: the pipeline name, Data Factory name, run ID, execution status, and status logs. These attributes are specified as a JSON object; the schema is given below.

{
    "type": "object",
    "properties": {
        "PipelineName": { "type": "string" },
        "DataFactoryName": { "type": "string" },
        "Run ID": { "type": "string" },
        "status": { "type": "string" },
        "Logs": { "type": "string" }
    }
}
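To make the contract concrete, here is a short Python sketch (all names are illustrative, not part of the tutorial) that builds a payload matching this schema and checks that every property the trigger expects is present and is a string; this is the shape Data Factory will POST to the Logic App.

```python
import json

# Properties declared in the Logic App trigger's JSON schema above.
EXPECTED_PROPERTIES = ["PipelineName", "DataFactoryName", "Run ID", "status", "Logs"]

def build_payload(pipeline, factory, run_id, status, logs):
    """Assemble the JSON body Data Factory will POST to the Logic App."""
    return {
        "PipelineName": pipeline,
        "DataFactoryName": factory,
        "Run ID": run_id,
        "status": status,
        "Logs": logs,
    }

def matches_schema(payload):
    """Every expected property must be present and be a string."""
    return all(isinstance(payload.get(p), str) for p in EXPECTED_PROPERTIES)

payload = build_payload("CopyZone1ToZone2", "demo-adf", "run-001", "Succeeded", "No errors")
body = json.dumps(payload)  # this text is what travels over the wire
print(matches_schema(payload))  # True
```

A payload whose property names differ from the schema (for example "RunID" instead of "Run ID") would fail this check, which is exactly the kind of mismatch that silently produces empty dynamic-content fields in the Logic App.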
Please refer to the screenshot below for more details.
![]()

Step 4: Adding the Send Email Action

After adding the trigger, we need to add the action that runs when the trigger fires; in our case, sending an email. Click the plus button below the trigger and you will be prompted with the "Add an action" popup. Type "send an email" and you will be offered many options, such as Gmail, Outlook, and Yahoo; select whichever is convenient. I have selected a Google email account.

![]()

Once you have added the send-email action, sign in to an account of the provider you selected. Give the connection any name, select the shared-application authentication type, and click Sign in.
![]()

Step 5: Writing the Email Alert Body

Once you have added the email action, add the recipients who should receive the pipeline alerts, and compose the body containing all the details: Data Factory name, pipeline name, run ID, status, and logs. You can add this data to the email subject and body dynamically. Refer to the screenshot below on how to add recipients.

![]() Adding recipients

Now let us add the dynamic content to the mail we want to send. If you click the lightning-bolt symbol in the left-side pane, you will be shown all the content received through the HTTP POST method. You can also add dynamic content to the body of the mail. Please refer to the screenshots.

![]()

Once you click the dynamic content button you can see all the details (Data Factory name, pipeline name, run ID, status, logs); add them in whatever format suits you. Please refer to the screenshot below.

![]()

Step 6: Saving the Logic App

Once you have made all the changes to the email subject and body, save the Logic App and copy the HTTP URL it generates; we will use it later in the Data Factory pipeline. Please refer to the screenshot below.

Note: this step is important; you must save the Logic App for the URL to be generated.

![]()

Step 7: Azure Data Factory Setup

Now create the Data Factory, similar to how you created the Logic App. In the Azure portal, click "Create a resource" and search for "Data Factory." Click "Create" and fill in the required details such as the Resource Group, Data Factory name, and Region, then click "Create." Navigate to your newly created Data Factory and open the "Author & Monitor" pane to create a new pipeline. Drag a "Copy Data" activity onto the pipeline canvas; this activity will copy data from zone1 to zone2 within your Azure Data Lake. Please refer to the screenshot below.
![]()

Step 8: Configure Source and Sink

Source: in the "Source" tab of the Copy Data activity, set up the source dataset to point to the zone1 container, defining the file format and other necessary details. Sink: in the "Sink" tab, set up the destination dataset to point to the zone2 container, again defining the necessary format and configuration. Please refer to the screenshots below.

Source Settings ![]() Sink Settings ![]()

Step 9: Adding a Web Activity

To send notifications for pipeline runs, add Web activities to your pipeline that send HTTP POST requests to your Logic App. The Web activity posts the pipeline's information to the HTTP URL hosted by the Logic App we created earlier. We add Web activities at the end of the pipeline to report whether it succeeded or failed. Please see the screenshots below.
![]()

Step 10: Adding Web Activities for Success and Failure

Now we need to configure the Web activities to complete their setup.
Please refer to the screenshots below. ![]()

In the JSON schema we wrote in the Logic App's HTTP request trigger, we defined the properties Data Factory name, pipeline name, run ID, status, and logs. We will supply values for those properties here by clicking the "Add dynamic content" button mentioned in the step above. Please refer to the screenshots below. ![]()

Note that pipeline() is a built-in object in Data Factory, available in every pipeline, that contains information about the current run. Connect one Web activity to the Copy activity's success output and another to its failure output, and give each its own body. The body for the success-path Web activity:

{
    "PipelineName": "@{pipeline().Pipeline}",
    "DataFactoryName": "@{pipeline().DataFactory}",
    "Run ID": "@{pipeline().RunId}",
    "status": "Succeeded",
    "Logs": "No errors"
}

And for the failure path, where activity('Copy Activity').error.message exposes the error text (substitute your Copy activity's actual name for "Copy Activity"):

{
    "PipelineName": "@{pipeline().Pipeline}",
    "DataFactoryName": "@{pipeline().DataFactory}",
    "Run ID": "@{pipeline().RunId}",
    "status": "Failed",
    "Logs": "@{activity('Copy Activity').error.message}"
}
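To see how the dynamic content ends up in the mail, here is a minimal Python sketch; the real substitution happens inside Logic Apps, and the template strings below are illustrative assumptions, not the tutorial's exact email text.

```python
# Simulates how the Logic App's dynamic content fills the email template.
# Template field names mirror the trigger schema.
SUBJECT_TEMPLATE = "ADF pipeline {PipelineName} {status}"
BODY_TEMPLATE = (
    "Data Factory: {DataFactoryName}\n"
    "Pipeline: {PipelineName}\n"
    "Run ID: {RunID}\n"
    "Status: {status}\n"
    "Logs: {Logs}\n"
)

def render_email(payload):
    """Substitute the POSTed values into the subject and body templates."""
    fields = dict(payload)
    fields["RunID"] = fields.pop("Run ID")  # str.format cannot take a key with a space
    return SUBJECT_TEMPLATE.format(**fields), BODY_TEMPLATE.format(**fields)

subject, body = render_email({
    "PipelineName": "CopyZone1ToZone2",
    "DataFactoryName": "demo-adf",
    "Run ID": "run-001",
    "status": "Failed",
    "Logs": "Source file not found in zone1",
})
print(subject)  # ADF pipeline CopyZone1ToZone2 Failed
```

This makes it easy to see why the property names in the Web activity body must match the trigger schema exactly: the email template can only reference fields that actually arrived in the POST.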
Step 11: Executing the Pipeline

Now our entire setup is done. Start executing the pipeline by clicking the Debug button and verify whether the alert mails arrive. Please look at the screenshots below.

![]()

We ran the pipeline at 10:24, and on success we should receive an email saying the pipeline succeeded. If you check the email account you listed among the recipients, you should find a mail with the success subject and body.

![]()

If you now look at zone2 of the Data Lake storage, you should see the data copied from zone1 by the pipeline run. Next, we delete the source dataset in zone1 to make the pipeline fail, so we can check the failure case as well. Please refer to the screenshot below.

![]()

Executing the Pipeline to Check the Failure Case

Let us execute the pipeline again and see how it behaves on failure. Please refer to the screenshots below.

![]()

Our pipeline has now failed, so we should have received an email alert.

![]()

As you can see, we received a failure alert along with log messages stating that our source file is missing.

Conclusion

In this guide, you set up a system that sends email notifications for your Azure Data Factory pipeline activities using Azure Logic Apps. You will get an email whenever your pipeline finishes successfully or fails, allowing you to keep a close eye on your data processes. By automating these notifications, you can respond quickly to any issues and keep your data tasks running smoothly. This setup shows how Azure's tools can make your work easier and more efficient; as you get more comfortable with them, you can explore further ways to automate and improve your workflows.

Send Failure & Success Pipeline Email via Logic Apps – FAQs

Can I send notifications to multiple email addresses?
Yes. In the send-email action's "To" field, add multiple addresses separated by semicolons, or use a distribution list address.
What are the prerequisites for setting up email notifications for Azure Data Factory (ADF) pipelines using Logic Apps?
An Azure subscription with permission to create a Data Factory and a Logic App, a storage account for the pipeline to work against (here, Data Lake Storage Gen2), and an email account supported by a Logic Apps connector (Gmail, Outlook, etc.).
How do I troubleshoot issues with email notifications?
Check the Logic App's run history in the portal to see whether the trigger fired and which action failed; confirm the Web activity uses the exact URL you saved in Step 6 and sends a Content-Type header of application/json; make sure the property names in the POST body match the trigger's JSON schema; and check the recipient's spam folder.
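When troubleshooting, it can also help to POST a test payload to the Logic App yourself, outside Data Factory. Here is a minimal Python sketch; the URL below is a placeholder, so substitute the callback URL you saved in Step 6 before sending.

```python
import json
import urllib.request

# Placeholder -- replace with the callback URL copied when saving the Logic App.
LOGIC_APP_URL = "https://example.logic.azure.com/workflows/demo/triggers/manual/paths/invoke"

payload = {
    "PipelineName": "CopyZone1ToZone2",
    "DataFactoryName": "demo-adf",
    "Run ID": "manual-test-001",
    "status": "Succeeded",
    "Logs": "Manual test outside Data Factory",
}

req = urllib.request.Request(
    LOGIC_APP_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},  # the trigger expects JSON
)

print(req.get_method())  # POST -- urllib defaults to POST when a body is attached
# Uncomment to actually send once LOGIC_APP_URL is real; an HTTP 202 response
# means the trigger fired and you should receive the test email.
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```

If the email arrives for this manual request but not for pipeline runs, the problem is on the Data Factory side (Web activity URL, headers, or body), not in the Logic App.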
Referred: https://www.geeksforgeeks.org