
Azure Data Factory for Beginners - Build Data Ingestion


David Mngadi

6:29:18

  • 001. Introduction to the Course.mp4
    05:05
  • 002. Introduction to ADF (Azure Data Factory).mp4
    03:43
  • 003. Requirements Discussion and Technical Architecture.mp4
    02:09
  • 004. Register a Free Azure Account.mp4
    04:14
  • 005. Create a Data Factory Resource.mp4
    08:39
  • 006. Create a Storage Account and Upload Data.mp4
    07:32
  • 007. Create Data Lake Gen 2 Storage Account.mp4
    05:51
  • 008. Download Storage Explorer.mp4
    04:28
  • 009. Create Your First Azure Pipeline.mp4
    16:28
  • 010. Closing Remarks.mp4
    06:30
  • 001. Introduction to Metadata-Driven Ingestion.mp4
    03:18
  • 002. High-Level Plan.mp4
    06:55
  • 003. Create Active Directory User.mp4
    02:31
  • 004. Assign the Contributor Role to the User.mp4
    03:35
  • 005. Disable Security Defaults.mp4
    01:49
  • 006. Creating the Metadata Database.mp4
    09:30
  • 007. Install Azure Data Studio.mp4
    05:54
  • 008. Create Metadata Tables and Stored Procedures.mp4
    08:14
  • 009. Reconfigure Existing Data Factory Artifacts.mp4
    07:09
  • 010. Set Up Logic App to Handle Email Notifications.mp4
    09:24
  • 011. Modify the Data Factory Pipeline to Send an Email Notification.mp4
    10:16
  • 012. Create Linked Service for Metadata Database and Email Dataset.mp4
    04:07
  • 013. Create Utility Pipeline to Send Email Notifications.mp4
    14:43
  • 014. Explaining the Email Recipients Table.mp4
    05:22
  • 015. Explaining the Get Email Addresses Stored Procedure.mp4
    02:30
  • 016. Modify Ingestion Pipeline to Use the Email Utility Pipeline.mp4
    04:40
  • 017. Tracking the Triggered Pipeline.mp4
    12:29
  • 018. Making the Email Notifications Dynamic.mp4
    16:52
  • 019. Making Logging of Pipeline Information Dynamic.mp4
    10:52
  • 020. Add a New Way to Log the Main Ingestion Pipeline.mp4
    13:28
  • 021. Change the Logging of Pipelines to Send Fail Message Only.mp4
    08:06
  • 022. Creating Dynamic Datasets.mp4
    11:25
  • 023. Reading from Source to Target - Part 1.mp4
    08:09
  • 024. Reading from Source to Target - Part 2.mp4
    12:53
  • 025. Explaining the Source to Target Stored Procedure.mp4
    04:48
  • 026. Add Orchestration Pipeline - Part 1.mp4
    07:10
  • 027. Add Orchestration Pipeline - Part 2.mp4
    09:02
  • 028. Fixing the Duplicating Batch Ingestions.mp4
    08:19
  • 029. Understanding the Pipeline Log and Related Tables.mp4
    09:33
  • 030. Understanding the GetBatch Stored Procedure.mp4
    04:59
  • 031. Understanding the Set Batch Status and GetRunID.mp4
    03:33
  • 032. Setting Up an Azure DevOps Git Repository.mp4
    06:39
  • 033. Publishing the Data Factory to Azure DevOps.mp4
    08:20
  • 034. Closing Remarks.mp4
    02:42
  • 001. Introduction.mp4
    03:05
  • 002. Read from Azure Storage Plan.mp4
    01:20
  • 003. Create Finance Container and Upload Files.mp4
    02:14
  • 004. Create Source Dataset.mp4
    06:14
  • 005. Write to Data Lake - Raw Plan.mp4
    02:33
  • 006. Create Finance Container and Directories.mp4
    02:39
  • 007. Create Sink Dataset.mp4
    04:35
  • 008. Data Factory Pipeline Plan.mp4
    01:45
  • 009. Create Data Factory and Read Metadata.mp4
    06:19
  • 010. Add Filter by CSV.mp4
    05:03
  • 011. Add Dataset to Read Files.mp4
    02:50
  • 012. Add the For Each CSV File Activity and Test Ingestion.mp4
    08:37
  • 013. Adding the Event-Based Trigger Plan.mp4
    01:18
  • 014. Enable the Event Grid Provider.mp4
    02:00
  • 015. Delete File and Add Event-Based Trigger.mp4
    01:47
  • 016. Create Event-Based Trigger.mp4
    04:05
  • 017. Publish Code to Main Branch and Start Trigger.mp4
    03:09
  • 018. Trigger Event-Based Ingestion.mp4
    03:26
  • 019. Closing Remarks.mp4
    02:24
    Description


    Building frameworks is now an industry norm, and knowing how to visualize, design, plan, and implement data frameworks has become an important skill. The framework we will build together is a Metadata-Driven Ingestion Framework. Metadata-driven frameworks allow a company to develop the system just once and have it adopted and reused by various business clusters without additional development, saving the business time and cost. Think of it as a plug-and-play system.
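
    To make the plug-and-play idea concrete, here is a minimal, illustrative Python sketch of metadata-driven ingestion; all of the names (IngestionConfig, its fields, the containers) are hypothetical stand-ins, not the course's actual schema or pipelines.

        # Minimal sketch of the metadata-driven idea: ingestion is driven by
        # configuration records instead of hard-coded pipelines.
        # All names below are hypothetical illustrations, not the course's schema.
        from dataclasses import dataclass

        @dataclass
        class IngestionConfig:
            source_container: str   # where the raw files arrive
            file_pattern: str       # e.g. "*.csv"
            sink_path: str          # target directory in the data lake
            enabled: bool = True

        # In the framework, records like these would live in a metadata database;
        # onboarding a new feed means adding a row, not writing a new pipeline.
        CONFIGS = [
            IngestionConfig("finance", "*.csv", "raw/finance"),
            IngestionConfig("hr", "*.csv", "raw/hr"),
        ]

        def run_ingestion(copy_file):
            """Loop over the metadata and hand each enabled feed to a generic copy step."""
            for cfg in CONFIGS:
                if cfg.enabled:
                    copy_file(cfg.source_container, cfg.file_pattern, cfg.sink_path)

        if __name__ == "__main__":
            # Stand-in for the real copy activity: just print what would be copied.
            run_ingestion(lambda src, pat, dst: print(f"copy {src}/{pat} -> {dst}"))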

    The first objective of the course is to onboard you onto the Azure Data Factory platform and help you assemble your first Azure Data Factory pipeline. Once you have a good grip on the Azure Data Factory development pattern, it becomes easier to apply the same pattern to onboard other data sources and sinks.

    Once you are comfortable building a basic Azure Data Factory pipeline, the second objective is to build a fully fledged, working metadata-driven framework that makes the ingestion more dynamic. We will also build the framework so that every batch orchestration and every individual pipeline run can be audited for business intelligence and operational monitoring, as sketched below.
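
    As a rough illustration of what that auditing could look like, here is a hedged Python sketch of a per-run log; the field and function names are hypothetical stand-ins for the pipeline log tables and stored procedures covered in the course.

        # Rough sketch of per-run audit logging; all names here are hypothetical
        # stand-ins for the course's metadata tables and stored procedures.
        from dataclasses import dataclass, field
        from datetime import datetime, timezone
        from typing import Optional
        import uuid

        @dataclass
        class PipelineRunLog:
            batch_id: str                  # groups every pipeline run in one orchestration
            pipeline_name: str
            status: str = "Running"        # later updated to "Succeeded" or "Failed"
            started_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
            finished_at: Optional[datetime] = None
            error_message: Optional[str] = None

        LOG = []                           # stand-in for a metadata database table

        def start_run(batch_id, pipeline_name):
            entry = PipelineRunLog(batch_id, pipeline_name)
            LOG.append(entry)
            return entry

        def finish_run(entry, ok, error=None):
            entry.status = "Succeeded" if ok else "Failed"
            entry.finished_at = datetime.now(timezone.utc)
            entry.error_message = error
            # In the course, a failure at this point would also trigger the
            # email notification pipeline.

        if __name__ == "__main__":
            batch = str(uuid.uuid4())      # one batch id per orchestration run
            run = start_run(batch, "ingest_finance_csv")
            finish_run(run, ok=True)
            print(LOG)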

    By the end of this course, you will be able to design, implement, and take to production data ingestion solutions in Azure.

    All the resource files are added to the GitHub repository at: https://github.com/PacktPublishing/Azure-Data-Factory-for-Beginners---Build-Data-Ingestion

    David Mngadi
    David Mngadi is a data management professional who is inspired by the power of data in our lives and has helped several companies become more data-driven to gain a competitive edge as well as meet regulatory requirements. Over the last 15 years, he has had the pleasure of designing and implementing data warehousing solutions in the retail, telco, and banking industries, and more recently in big data lake implementations. He is passionate about technology and teaching programming online.
    Packt is a publishing company founded in 2003 and headquartered in Birmingham, UK, with offices in Mumbai, India. Packt primarily publishes print and electronic books and videos relating to information technology, including programming, web design, data analysis, and hardware.
    • Language: English
    • Training sessions: 63
    • Duration: 6:29:18
    • Release Date: 2023/02/06