
Databricks - Master Azure Databricks for Data Engineers


Learning Journal, Prashant Kumar Pandey

17:33:26

  • 1. Course Prerequisites.mp4
    02:14
  • 2. About the Course.mp4
    05:07
  • 3. How to access Course Material and Resources.mp4
    11:51
  • 4.1 Capstone Project.zip
  • 4.2 Notebooks.zip
  • 4.3 SampleData.zip
  • 4. Note for Students - Before Start.mp4
    02:05

Introduction
  • 1. Introduction to Data Engineering.mp4
    14:19
  • 2. Apache Spark to Data Engineering Platform.mp4
    12:45
  • 3.1 02-Introduction.pdf
  • 3. Introduction to Databricks Platform.mp4
    08:32

Getting Started
  • 1. What will you learn in this section.mp4
    02:38
  • 2. Creating Azure Cloud Account.mp4
    10:03
  • 3. Azure Portal Overview.mp4
    05:19
  • 4. Creating Databricks Workspace Service.mp4
    16:48
  • 5. Introduction to Databricks Workspace.mp4
    14:06
  • 6.1 03-Getting Started.pdf
  • 6. Azure Databricks Platform Architecture.mp4
    11:10

Working in Databricks Workspace
  • 1. What will you learn in this section.mp4
    02:10
  • 2. How to create Spark Cluster.mp4
    16:34
  • 3. Working with Databricks Notebook.mp4
    13:08
  • 4. Notebook Magic Commands.mp4
    07:48
  • 5.1 04-Working in Databricks Workspace.pdf
  • 5.2 ch4-working in databricks workspace.zip
  • 5. Databricks Utilities Package.mp4
    17:19

Working with Databricks File System - DBFS
  • 1. What will you learn in this section.mp4
    01:56
  • 2. Introduction to DBFS.mp4
    03:26
  • 3. Working with DBFS Root.mp4
    09:02
  • 4.1 05-Working with Databricks File System - DBFS.pdf
  • 4.2 ch5-working with databricks file system - dbfs.zip
  • 4.3 people.zip
  • 4. Mounting ADLS to DBFS.mp4
    41:31

Setting up Unity Catalog
  • 1. What will you learn in this section.mp4
    02:44
  • 2. Introduction to Unity Catalog.mp4
    12:30
  • 3. Setup Unity Catalog.mp4
    30:04
  • 4. Unity Catalog User Provisioning.mp4
    13:38
  • 5.1 06-Setting up Unity Catalog.pdf
  • 5.2 ch6-working with unity catalog.zip
  • 5. Working with Securable Objects.mp4
    38:08

Working with Delta Lake and Delta Tables
  • 1. What will you learn in this section.mp4
    01:51
  • 2. Introduction to Delta Lake.mp4
    06:25
  • 3. Creating Delta Table.mp4
    15:27
  • 4. Sharing data for External Delta Table.mp4
    12:48
  • 5. Reading Delta Table.mp4
    10:04
  • 6. Delta Table Operations.mp4
    21:58
  • 7. Delta Table Time Travel.mp4
    20:56
  • 8. Convert Parquet to Delta.mp4
    08:22
  • 9. Delta Table Schema Validation.mp4
    21:12
  • 10. Delta Table Schema Evolution.mp4
    28:56
  • 11. Look Inside Delta Table.mp4
    18:28
  • 12.1 07-Working with Delta Lake and Delta Tables.pdf
  • 12.2 CH7-Data Files.zip
  • 12.3 ch7-working with delta lake and delta tables.zip
  • 12. Delta Table Utilities and Optimization.mp4
    43:41

Working with Databricks Incremental Ingestion Tools
  • 1. What will you learn in this section.mp4
    01:14
  • 2. Architecture and Need for Incremental Ingestion.mp4
    06:05
  • 3. Using Copy Into with Manual Schema Evolution.mp4
    17:53
  • 4. Using Copy Into with Automatic Schema Evolution.mp4
    13:41
  • 5. Streaming Ingestion with Manual Schema Evolution.mp4
    09:43
  • 6. Streaming Ingestion with Automatic Schema Evolution.mp4
    08:55
  • 7. Introduction to Databricks Autoloader.mp4
    05:10
  • 8.1 08-Working with Databricks Incremental Ingestion Tools.pdf
  • 8.2 CH8-Data Files.zip
  • 8.3 ch8-working with databricks incremental ingestion tools.zip
  • 8. Autoloader with Automatic Schema Evolution.mp4
    31:54

Working with Databricks Delta Live Tables (DLT)
  • 1. What will you learn in this section.mp4
    01:31
  • 2. Introduction to Databricks DLT.mp4
    06:21
  • 3. Understand DLT Use Case Scenario.mp4
    10:33
  • 4. Setup DLT Scenario Dataset.mp4
    06:06
  • 5. Creating DLT Workload in SQL.mp4
    45:44
  • 6. Creating DLT Pipeline for your Workload.mp4
    19:41
  • 7.1 09-Working with Databricks Delta Live Tables (DLT).pdf
  • 7.2 CH9-Data Files.zip
  • 7.3 ch9-working with databricks delta live tables (dlt).zip
  • 7. Creating DLT Workload in Python.mp4
    46:57

Databricks Project and Automation Features
  • 1. What will you learn in this section.mp4
    01:55
  • 2. Working with Databricks Repos.mp4
    21:19
  • 3. Working with Databricks Workflows.mp4
    15:02
  • 4. Working with Databricks Rest API.mp4
    18:45
  • 5.1 10-Databricks Project and Automation Features.pdf
  • 5.2 CH10-Data Files.zip
  • 5.3 ch10-databricks project and automation features.zip
  • 5.4 Deploy.zip
  • 5. Working with Databricks CLI.mp4
    12:22

Capstone Project
  • 1. Project Scope and Background.mp4
    08:48
  • 2. Taking out the operational requirement.mp4
    03:56
  • 3. Storage Design.mp4
    06:07
  • 4. Implement Data Security.mp4
    04:55
  • 5. Implement Resource Policies.mp4
    01:56
  • 6. Decouple Data Ingestion.mp4
    03:55
  • 7. Design Bronze Layer.mp4
    03:25
  • 8. Design Silver and Gold Layer.mp4
    03:44
  • 9. Setup your environment.mp4
    02:52
  • 10. Create a workspace.mp4
    07:23
  • 11. Create a Storage Layer.mp4
    08:19
  • 12. Setup Unity Catalog.mp4
    06:46
  • 13. Create Metadata Catalog and External Locations.mp4
    09:35
  • 14. Setup your source control.mp4
    03:50
  • 15. Start Coding.mp4
    10:34
  • 16. Test your code.mp4
    12:28
  • 17. Load historical data.mp4
    05:16
  • 18. Ingest into bronze layer.mp4
    08:40
  • 19. Process the silver layer.mp4
    12:27
  • 20. Handling multiple updates.mp4
    05:17
  • 21. Implementing Gold Layer.mp4
    05:41
  • 22. Creating a run script.mp4
    08:47
  • 23. Preparing for Integration testing.mp4
    03:49
  • 24. Creating Test Data Producer.mp4
    02:45
  • 25. Creating Integration Test for Batch mode.mp4
    06:11
  • 26. Creating Integration Test for Stream mode.mp4
    17:42
  • 27. Implementing CI CD Pipeline.mp4
    09:49
  • 28. Develop Build Pipeline.mp4
    07:54
  • 29. Develop Release Pipeline.mp4
    17:38
  • 30.1 Data Set.zip
  • 30.2 Notebooks.zip
  • 30.3 Other Code.zip
  • 30. Creating Databricks CLI Script.mp4
    04:00

Conclusion
  • 1. Congratulations.mp4
    01:03
  • 2. Keep Learning and Keep Growing.html


    Learn Azure Databricks for professional data engineers using PySpark and Spark SQL

    What You'll Learn?


    • Databricks in Azure Cloud
    • Working with DBFS and Mounting Storage
    • Unity Catalog - Configuring and Working
    • Unity Catalog User Provisioning and Security
    • Working with Delta Lake and Delta Tables
    • Manual and Automatic Schema Evolution
    • Incremental Ingestion into Lakehouse
    • Databricks Autoloader
    • Delta Live Tables and DLT Pipelines
    • Databricks Repos and Databricks Workflow
    • Databricks REST API and CLI
    • Capstone Project

    Who is this for?


  • Data Engineers
  • Data Engineering Solution Architects

    What You Need to Know?


  • Python Programming Language
  • Apache Spark and Dataframe APIs using Python
  • Spark Structured Streaming APIs using Python


    Description

    About the Course

    I created Databricks - Master Azure Databricks for Data Engineers on the Azure cloud platform. This course will help you learn the following:


    1. Databricks in Azure Cloud

    2. Working with DBFS and Mounting Storage

    3. Unity Catalog - Configuring and Working

    4. Unity Catalog User Provisioning and Security

    5. Working with Delta Lake and Delta Tables

    6. Manual and Automatic Schema Evolution

    7. Incremental Ingestion into Lakehouse

    8. Databricks Autoloader

    9. Delta Live Tables and DLT Pipelines

    10. Databricks Repos and Databricks Workflow

    11. Databricks REST API and CLI
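
    The schema-evolution topic above (item 6) can be sketched in plain Python. This is a conceptual illustration only, using a hypothetical `evolve_schema` helper; it is not the Delta Lake or Databricks API, which the course covers with `mergeSchema` and related options:

```python
# A minimal, library-free sketch of the idea behind automatic schema
# evolution: new columns from an incoming batch are appended to the
# table schema, while a changed type for an existing column is rejected.
# This only mimics the concept behind Delta Lake's schema evolution;
# it is NOT the Databricks/Delta API.

def evolve_schema(table_schema: dict, batch_schema: dict) -> dict:
    """Return the evolved schema, or raise on an incompatible type change."""
    evolved = dict(table_schema)
    for column, dtype in batch_schema.items():
        if column not in evolved:
            # Automatic evolution: unseen columns are added to the table.
            evolved[column] = dtype
        elif evolved[column] != dtype:
            # A type change on an existing column is a schema violation.
            raise TypeError(
                f"column '{column}': {evolved[column]} -> {dtype} not allowed"
            )
    return evolved

table = {"id": "bigint", "name": "string"}
batch = {"id": "bigint", "name": "string", "email": "string"}
print(evolve_schema(table, batch))
# {'id': 'bigint', 'name': 'string', 'email': 'string'}
```

    Manual schema evolution, by contrast, would require altering the table definition before such a batch is accepted.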

    Capstone Project

    This course also includes an end-to-end Capstone project. The project walks you through real-life project design, coding, implementation, testing, and the CI/CD approach.
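
    The Capstone project follows the Medallion (bronze/silver/gold) layering shown in the curriculum. As a conceptual sketch only, with plain-Python stand-ins and hypothetical helper names rather than the course's PySpark code:

```python
# Conceptual medallion-architecture flow from the capstone outline:
# raw events land in bronze, are cleaned into silver, and are
# aggregated into gold. Plain Python stand-ins, not PySpark.

def to_bronze(raw_records):
    """Bronze: ingest records as-is, tagging each with a source marker."""
    return [{**r, "_source": "landing"} for r in raw_records]

def to_silver(bronze):
    """Silver: drop malformed records and normalize fields."""
    return [
        {"user": r["user"].strip().lower(), "amount": float(r["amount"])}
        for r in bronze
        if r.get("user") and r.get("amount") is not None
    ]

def to_gold(silver):
    """Gold: aggregate per user for reporting."""
    totals = {}
    for r in silver:
        totals[r["user"]] = totals.get(r["user"], 0.0) + r["amount"]
    return totals

raw = [{"user": " Alice ", "amount": "10"},
       {"user": None, "amount": "5"},
       {"user": "alice", "amount": "2.5"}]
print(to_gold(to_silver(to_bronze(raw))))
# {'alice': 12.5}
```

    Each layer only depends on the previous one, which is what lets the project decouple ingestion, testing, and reporting.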

    Who should take this Course?

    I designed this course for data engineers who want to develop Lakehouse projects following the Medallion architecture approach on the Databricks cloud platform. It is also for data and solution architects responsible for designing and building their organization's Lakehouse platform infrastructure, as well as for managers and architects who do not directly work on Lakehouse implementation but work with those who implement it at the ground level.

    Spark Version Used in the Course

    This course uses Databricks in Azure Cloud and Apache Spark 3.5. I have tested all the source code and examples in this course on Azure Databricks using Databricks Runtime 13.3.


    Learning Journal
    Learning Journal is a small team of people passionate about helping others learn and grow in their careers by bridging the gap between their existing and required skills. In our quest to fulfill this mission, we author books, publish technical articles, and create training videos to help IT professionals and students succeed in the industry. Together we have more than 40 years of experience in IT as developers, architects, consultants, trainers, and mentors, and we have worked with international software services organizations on various data-centric and big data projects. We are firm believers in lifelong continuous learning and skill development. To popularize its importance, we started publishing free training videos on our YouTube channel, creating a journal of our learning under the Learning Journal banner. We have authored various skill development courses, training sessions, and technical articles since the beginning of 2018.
    Prashant Kumar Pandey
    Prashant Kumar Pandey is passionate about helping people learn and grow in their careers by bridging the gap between their existing and required skills. In his quest to fulfill this mission, he authors books, publishes technical articles, and creates training videos to help IT professionals and students succeed in the industry. With over 18 years of experience in IT as a developer, architect, consultant, trainer, and mentor, he has worked with international software services organizations on various data-centric and big data projects. Prashant is a firm believer in lifelong continuous learning and skill development. To popularize its importance, he started publishing free training videos on his YouTube channel and conceptualized the idea of creating a journal of his learning under the Learning Journal banner. He is the founder, lead author, and chief editor of the Learning Journal portal, which has offered various skill development courses, training sessions, and technical articles since the beginning of 2018.
  • Language: English
  • Training sessions: 90
  • Duration: 17:33:26
  • Release date: 2024/03/03